Changes in students' learning skills through the first-year experience course: a case study over three years at a Japanese university

Ryo Sakurai (Ritsumeikan University–Osaka Ibaraki Campus, Ibaraki, Japan)

Journal of Applied Research in Higher Education

ISSN: 2050-7003

Article publication date: 16 March 2022

Issue publication date: 2 January 2023

Abstract

Purpose

This study was conducted to understand students' achievements in learning and to improve the overall curriculum of the first-year experience course.

Design/methodology/approach

In this study, a series of questionnaire-based surveys were conducted on students enrolled in the Introductory Seminar for Policy Science, a mandatory first-year experience course offered in the first semester (from April to July) at a university in Japan. The studies were conducted in 2015 (n = 29), 2016 (n = 29) and 2017 (n = 31).

Findings

Results revealed that, regardless of the year, students deepened their understanding of policy science and gained confidence in explaining what group work and reports are over the course of the semester. In addition, students' level of worry about life at the university decreased throughout the course in all three years. A stepwise multiple regression analysis (n = 84) revealed that students who knew what policy science was (B = 0.271) and who were confident about writing their opinions in reports (B = 0.264) were more likely to answer that they knew what they wanted to study over their four years at the university.

Originality/value

This study revealed that the mandatory first-year experience course taught by the same instructor generated similar educational effects for different students in different years. The results elucidated the progressive effects of different components of the course, eliminating possibilities of any bias or specific characteristics of a single group of students.

Citation

Sakurai, R. (2023), "Changes in students' learning skills through the first-year experience course: a case study over three years at a Japanese university", Journal of Applied Research in Higher Education, Vol. 15 No. 1, pp. 185-198. https://doi.org/10.1108/JARHE-05-2021-0190

Publisher

Emerald Publishing Limited

Copyright © 2021, Ryo Sakurai

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

Course evaluation is one of the most important approaches to elucidating the effectiveness of teaching and ensuring high-quality education in universities (Yamada, 2012; Freeman and Dobbins, 2013; Erikson et al., 2016). In Japan, the Central Education Committee of the Ministry of Education and Science declared that each university must ensure that students acquire the knowledge and skills expected from higher education (Hamana et al., 2013). This requires universities to evaluate the effectiveness of the education they provide. Such an evaluation includes assessing what students learn in their daily classes and how their perceptions change through university life, and publicizing the results (Matsushita, 2012). Educational evaluation is important not only from the perspective of a university's accountability but also in light of decreasing enrolments in Japanese universities (Hamana et al., 2013). Universities need to demonstrate the significance and impact of their education to attract students. However, little research at Japanese universities has collected objective data on students' learning achievements and experiences and used it to actively improve courses and curricula (Yamada, 2012). Recognizing the importance of this issue, various efforts have been made to understand students' learning processes and outcomes, and research results on this theme have recently been compiled (e.g. Yamada, 2016; Fujiki et al., 2020; Matsushita, 2020). Furthermore, conducting systematic and meaningful evaluation in higher education has been challenging worldwide, owing to issues such as the lack of a standardized national policy on course evaluation and variations in practices at each institute (Freeman and Dobbins, 2013).

The College of Policy Science, Ritsumeikan University was selected as the research site because (1) Ritsumeikan is one of the largest private universities in Japan (with more than 30,000 undergraduate students) and (2) the college accepts students with a variety of interests related to policy science; the results could therefore be generalizable to students at other universities (Sakurai et al., 2020). Although the college has nearly 30 years of history (it was established in 1994), limited research has been conducted on students' learning skills and changes in their perceptions through its undergraduate courses. This study specifically focused on the mandatory first-year experience course, "Introductory Seminar," offered in the first semester.

The role of the first-year experience course in supporting freshmen in transitioning smoothly from high school to university has been noted by various researchers and institutes (e.g. Chemers et al., 2001; Parker et al., 2004; Yamada, 2012; Association of First Year Experience Course, 2014). At the College of Policy Science, Ritsumeikan University, students learn the basics of policy science and acquire skills related to reading, writing, presentation and debate. These skills correspond to the contents typically included in universities' first-year experience courses: study skills (e.g. how to write reports and papers) and student skills (e.g. possessing positive attitudes toward learning) (Yamada, 2012). This study examined the effectiveness of the first-year experience course in terms of acquiring those skills by surveying students multiple times during a semester. Additionally, the factors (e.g. students' learning skills and demographic attributes such as gender) affecting students' vision of their further study at the university were analyzed. The author hypothesized that students' learning skills would improve during the semester and, furthermore, that learning skills and demographic attributes would affect how clearly students envisioned their further study at the university. The concepts of self-efficacy and grounded theory were used to guide this research, as explained in detail in the Methods section. The results of this study are expected to help teachers and university managers consider ways to evaluate and improve such programs.

Literature review

First-year experience, self-efficacy and students' learning skills

Regarding the first-year experience course and the effects of active learning on self-efficacy in Japan, Hatano et al. (2015) showed how an active-learning-style course could enhance students' academic achievement from the first to the third year of university education. Mori and Yamada (2009) revealed how collaborative learning in the first-year experience course could foster supportive behavior among students with different study skills. Limited research has been conducted in other countries to understand students' self-efficacy, especially throughout the first-year experience course. However, Chemers et al. (2001) studied first-year university students in California, USA, and revealed that academic self-efficacy and optimism were related to students' academic performance and adjustment. Furthermore, Erikson et al. (2016) studied first-year students in Sweden and revealed that the course evaluation itself enabled students to develop their identities as learners by answering interview questions and reflecting on their learning; this was important for building their self-confidence during early courses at the university. Meanwhile, Culver and Bowman (2020) analyzed large, longitudinal, multi-institutional data and revealed that first-year seminars were not effective in fostering students' academic achievement, although they improved students' college satisfaction.

Regarding students' learning skills and the influence of demographic attributes, a study by Tekkol and Demirel (2018) revealed that female university students had significantly higher learning skills than male students, while a study by Virtanen and Nevgi (2010) demonstrated no gender difference in self-efficacy-related learning skills.

Overall, the literature review implies that a limited number of studies have been conducted on the effectiveness of the first-year experience course in Japan compared to other countries, such as the United States (Yamada, 2012); a similar trend can be observed in other Asian countries (Ding and Curtis, 2020).

Previous study that conducted surveys multiple times in one semester

Uemoto and Ito (2016) used methods similar to those of this study and surveyed students three times in a semester. They revealed that value (e.g. whether students believe that understanding the contents of study is important to them) affected emotional engagement (e.g. whether students find it fun to learn new things) at the beginning of the semester, while it encouraged self-efficacy at the end of the semester (Uemoto and Ito, 2016). Conversely, few studies have assessed the efficacy of the first-year experience course by surveying the same sample of students several times throughout the course, repeating this in different years with different groups of students, and conducting both quantitative and qualitative analyses. Additionally, few studies have examined which factors (among students' learning skills and demographic attributes such as gender) affect students' vision of their further study at the university. This study fills these gaps by elucidating the progressive effects of the components of the course while eliminating the possibility of bias from the specific characteristics of a single group of students.

Previous study at the College of Policy Science, Ritsumeikan University

At the College of Policy Science, Ritsumeikan University, research on the evaluation of the first-year experience course revealed that students' understanding of policy science was enhanced significantly; furthermore, the course increased their confidence in making presentations and doing group work, and they could describe policy science eloquently by the end of the course (Sakurai, 2017). However, it was unknown whether those learning effects were generated only for a specific year (for example, students enrolled in the course in 2015) or whether similar effects could be expected with different students in different years. Hence, a series of evaluation studies was conducted on the first-year experience course, "Introductory Seminar," for three consecutive years (2015, 2016 and 2017)—each with a different batch of students (freshmen)—to determine whether similar learning effects could be found across the three years and whether the validity and reliability of the educational approach could be demonstrated.

Methods

Contents of the course

The Introductory Seminar at the College of Policy Science, Ritsumeikan University is offered to all freshmen. Around 360 freshmen are divided into 12 classes of about 30 students each, with one of 12 instructors teaching each class. The class taught by the author was selected for this research, since the author could examine the teaching materials and students' reactions and conduct a series of surveys during the semester. Since the approximately 360 students were randomly divided into 12 classes, it is reasonable to assume that the results of this one class represent the whole population of freshmen at the College of Policy Science to a certain degree. In addition, the fact that the same syllabus, including course contents and materials, was used for all 12 classes indicates that the participating class had a learning environment similar to that of the other classes.

The course generally has fifteen weeks of classes (90 min each); however, for the 2015 batch, there were only thirteen weeks of classes, with 105 min per class. The total duration of class hours was equal (1,350 min). Students could earn two credits by taking this course. The syllabus explains that students learn about policy, including how to design it, and that they are expected to acquire logical thinking, presentation and writing skills through the Introductory Seminar. This study defines all the aforementioned skills as "students' learning skills." While there is a supplementary textbook, "Introduction of Policy Science," which students can study to enhance their understanding, each instructor has the authority to decide the detailed contents of their classes. In the classes conducted by the author, the same contents were taught in all three years of the study period, thus eliminating content as a confounding factor. In the first few weeks, the concepts of policy science were taught; in the next several weeks, the concepts of group work, academic reading and writing were covered; and in the final weeks, the focus shifted to writing skills, the concepts of debate and workshops were taught, and an actual debate and workshop took place (Table 1).

Survey instruments

In previous studies, two types of evaluation, direct and indirect, were mainly used to measure the effects of a program. Direct evaluation obtains information about the knowledge and skills students acquired through the program (e.g. examinations, reports and presentations), while indirect evaluation, conducted through surveys and interviews, relies on students' self-reports (Kawanabe et al., 2013). Indirect evaluation has the limitation that the data are based solely on self-report and are not objective. However, it is challenging to obtain a nuanced understanding of students' learning behavior or the process of learning through direct evaluation alone. Hence, the two methods used together may complement each other. Previous studies have shown that indirect evaluation results correspond with those of direct evaluation (Anaya, 1999; Yamada, 2012). Some studies have also highlighted the importance of qualitative methods for obtaining new insights (Erikson et al., 2016).

In this study, the author conducted the survey (indirect evaluation) thrice: at the beginning of the course (orientation day, before the classes begin; pre-term survey), in the middle of the course (week 7 for 2015 and week 8 for 2016 and 2017; mid-term survey) and at the end of the course (week 13 for 2015 and week 15 for 2016 and 2017; post-term survey) to understand the change in students' learning skills, including their perceptions and confidence regarding the course. Indirect evaluation was used for this study since the author aimed to understand students' learning process and how they perceived their learning skills. To analyze students' learning skills that are specifically applicable to contents of this course, survey items were developed based on the syllabus of the Introductory Seminar. Survey items included questions regarding the following learning skills that students were expected to acquire through the Introductory Seminar (Table 2):

(1) Understanding of policy science (Q1)

(2) Understanding of group work and discussion, as well as the confidence to state their opinion (Q2, 3)

(3) Understanding of a report/paper and the confidence to write a report (Q4, 5)

(4) Understanding of a presentation and the confidence to present well (Q6, 7)

Additionally, questions were included assuming that educational effects include not only what students learned in the class but also their change in perceptions toward student life (e.g. how much they adapted to student life).

(5) Students' worries and expectations, as well as what they wished to study at the College of Policy Science (Q8, 9, 10)

Answers to these ten questions (Q1–10) related to students' learning skills were given on a 7-point Likert scale (1 = strongly disagree, 7 = strongly agree). Finally, three open-ended questions, (1) "what is your opinion of policy science?," (2) "how do you define policy science?" and (3) "what do you expect from university life?," were asked in the survey. While these three open-ended questions do not directly measure students' learning skills, they provide insights into students' understanding of the course contents and their perceptions of university life.

Before each survey, the author explained the contents and objectives of this research. Furthermore, the anonymity of all the data obtained was ensured. Students were informed that their participation was voluntary. The survey was distributed once the author obtained informed consent from all the students.

While a pilot study was not conducted before the first survey due to time limitations, the pre-term survey of 2015 showed that students understood the questions well and that the items representing students' learning skills directly related to the contents taught in the course (Q1–7) showed high reliability (Cronbach's α = 0.866). These items showed high reliability for the pre-term surveys of 2016 and 2017 as well (Cronbach's α = 0.887 for 2016 and 0.882 for 2017). Therefore, the author confirmed the validity and reliability of the items, and the same items were used in 2016 and 2017.
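The Cronbach's α values reported above can be computed directly from an item-response matrix. The following Python sketch illustrates the standard formula; the response matrix here is fabricated purely to match the survey's shape (29 respondents × 7 items) and is not the study's data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Fabricated 7-point Likert responses (29 students x 7 items, Q1-7);
# NOT the study's data, only a shape-compatible illustration.
rng = np.random.default_rng(0)
base = rng.integers(2, 7, size=(29, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(29, 7)), 1, 7)
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

Perfectly correlated items yield α = 1, and α rises as items covary more strongly, which is why it serves as an internal-consistency check for multi-item scales like Q1–7.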

Theoretical background

The author expected to see changes in students' learning skills (how confident they were in their understanding of the course contents and in their learning skills) based on the concept of self-efficacy. Self-efficacy is the belief that one has the ability to succeed in a specific situation; it affects one's motivation to take action (Krasny, 2020). Previous studies have also used items related to self-efficacy to measure students' learning skills (Ayyildiz and Tarhan, 2015; Tekkol and Demirel, 2018). Efficacy can be built through various strategies, such as providing environments where students can work on challenging mastery experiences, as well as through feedback given by teachers and peers on their performance. The course includes many such experiences, in which students work on challenging assignments and exchange feedback with each other.

For analyzing the open-ended questions, the author followed grounded theory, an approach in which data are collected without preconceived hypotheses or assumptions (Brinkmann and Kvale, 2015). While the author had hypothesized that students' learning skills would increase after taking the course (measured via the Likert-scale questions), for the open-ended questions the author aimed to understand the variance in students' perceptions and develop ideas based on the data.

Analysis

For the 10 items with Likert-scale responses, analysis of variance (ANOVA) and stepwise multiple regression were used for statistical analysis. First, ANOVA was used to identify differences in average scores among the pre-term, mid-term and post-term surveys. Although a similar ANOVA had already been conducted for the 2015 data (Sakurai, 2017), in this research (with the 2015, 2016 and 2017 samples) post-hoc comparisons were carried out using methods appropriate to the data: Tukey's honestly significant difference test was used where equal variances could be assumed, whereas the Games–Howell test was used where they could not.
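As a rough illustration of this testing sequence, the following Python sketch (using scipy; the study itself used SPSS, and all scores here are fabricated) runs a one-way ANOVA across the three survey waves and uses Levene's test as a stand-in check of the equal-variance assumption that determines the choice between Tukey's HSD and Games–Howell:

```python
import numpy as np
from scipy import stats

# Fabricated pre/mid/post scores for one Likert item (n = 29 per wave);
# not the study's data, only an illustration of the analysis sequence.
rng = np.random.default_rng(1)
pre = np.clip(rng.normal(3.5, 1.0, 29).round(), 1, 7)
mid = np.clip(rng.normal(4.5, 1.0, 29).round(), 1, 7)
post = np.clip(rng.normal(5.5, 1.0, 29).round(), 1, 7)

# One-way ANOVA across the three survey waves
f_stat, p_val = stats.f_oneway(pre, mid, post)

# Equal-variance check (Levene) decides the post-hoc test, mirroring the
# paper's rule: Tukey's HSD if equal variances are assumed, Games-Howell if not
lev_stat, lev_p = stats.levene(pre, mid, post)
posthoc = "Tukey HSD" if lev_p >= 0.05 else "Games-Howell"
print(f"F = {f_stat:.2f}, p = {p_val:.4g}, post-hoc: {posthoc}")
```

With clearly separated wave means, as here, the omnibus F test is significant and the post-hoc test then locates which pairs of waves differ.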

Furthermore, a stepwise multiple regression was conducted to understand which variables (among those covered by Q1–9 and gender) affected students' idea of what they wanted to study. ANOVA was therefore performed separately on the data for each of the three years, while the stepwise multiple regression was performed on the pooled data from all three years (using the post-term survey) so that the sample size (n > 80) was sufficient for regression analysis. All statistical analyses were conducted using SPSS software Version 22 (IBM, Tokyo, Japan) with a significance level of p < 0.05.
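Stepwise selection of this kind can be sketched as a greedy forward search. The snippet below is a minimal, hypothetical illustration (SPSS's stepwise procedure uses F-to-enter/F-to-remove criteria; adjusted R² is used here only as a simple stand-in), with fabricated predictors labeled Q1–Q4:

```python
import numpy as np

def adj_r2(X, y):
    """Adjusted R^2 of an OLS fit with intercept."""
    n, k = X.shape
    Xb = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    resid = y - Xb @ beta
    r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

def forward_stepwise(X, y, names):
    """Greedy forward selection: repeatedly add the predictor that most
    improves adjusted R^2; stop when no addition improves it."""
    selected, best = [], -np.inf
    while True:
        candidate = None
        for j in range(X.shape[1]):
            if j in selected:
                continue
            score = adj_r2(X[:, selected + [j]], y)
            if score > best:
                best, candidate = score, j
        if candidate is None:
            return [names[j] for j in selected], best
        selected.append(candidate)

# Hypothetical demo: y driven by two of four fabricated predictors (n = 84)
rng = np.random.default_rng(2)
X = rng.normal(size=(84, 4))
y = 0.8 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.4, size=84)
picked, score = forward_stepwise(X, y, ["Q1", "Q2", "Q3", "Q4"])
print(picked, round(score, 3))
```

The search correctly recovers the two informative predictors in this toy setup; real stepwise procedures additionally apply significance thresholds when entering and removing variables.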

For the open-ended questions, a text-mining analysis, which is suitable for mechanically analyzing large amounts of text data (in this research, more than 1,000 sentences), was conducted using KH Coder Version 3 (Higuchi, 2021). Data for all three years were analyzed together, and the words most frequently mentioned in the pre-term and post-term surveys were identified. These are listed in Table 3; the words were extracted using the Jaccard score (the level of relationship between each word and the pre-term/post-term survey).
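The Jaccard score can be illustrated as a set-overlap measure between the responses containing a word and the responses belonging to one survey wave. The following minimal Python sketch uses hypothetical response IDs, not the study's data:

```python
def jaccard(word_docs: set, wave_docs: set) -> float:
    """Jaccard coefficient between the set of responses containing a word
    and the set of responses belonging to one survey wave."""
    union = word_docs | wave_docs
    return len(word_docs & wave_docs) / len(union) if union else 0.0

# Hypothetical response IDs (not the study's data): responses 0-4 are
# pre-term, 5-9 are post-term; "anxiety" appears mostly in pre-term ones.
pre_wave = {0, 1, 2, 3, 4}
post_wave = {5, 6, 7, 8, 9}
anxiety_docs = {0, 1, 3, 4, 6}
print(jaccard(anxiety_docs, pre_wave), jaccard(anxiety_docs, post_wave))
```

A word scores high for a wave when it occurs in many of that wave's responses and rarely elsewhere, which is why "anxiety" associates with the pre-term survey in this toy example.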

Results

All students taking the course responded to the survey; in 2015, 2016 and 2017 there were 29, 29 and 31 respondents, respectively. Post-hoc comparisons following ANOVA revealed that most of the measured aspects increased significantly from the pre-term to the post-term survey in all three years (Table 2); only the 2016 responses for two items, "I know what presentation is" and "I am confident to make presentation properly in front of people," did not change significantly. As for perceptions of student life, respondents' worries about university life decreased significantly every year, and there were no significant differences in respondents' excitement toward student life or in the clarity of what they wanted to study at the College of Policy Science between the pre-term and post-term surveys.

The stepwise multiple regression analysis (n = 84) revealed that four independent variables significantly affected the dependent variable (whether students had a clear vision of what they wanted to study at the college). Whether respondents knew what policy science was had the strongest effect (B = 0.271, p < 0.01) on the dependent variable, while whether they were confident about writing their opinions in reports (B = 0.264, p = 0.01), gender (B = −0.227, p = 0.01 [male students had greater clarity about what they wanted to study]) and whether they could state their opinion in group work or group discussion (B = 0.217, p = 0.05) also had significant effects. The adjusted R2 was 0.380, implying that about 38% of the variance of the dependent variable was explained by these four independent variables. VIF scores for the four independent variables were less than 2.0, so the level of multi-collinearity was concluded to be low (Vaske, 2008).
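The VIF check reported above follows from regressing each predictor on the others: VIF_j = 1 / (1 − R²_j). A minimal Python sketch with fabricated predictors (not the study's data) illustrates the computation:

```python
import numpy as np

def vif(X):
    """Variance inflation factors: VIF_j = 1 / (1 - R^2_j), where R^2_j is
    from regressing predictor j on all other predictors (plus intercept)."""
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
        out[j] = 1.0 / (1.0 - r2)
    return out

# Fabricated predictors for n = 84 respondents (not the study's data);
# independent predictors should yield VIF values near 1 (well below 2.0).
rng = np.random.default_rng(3)
X = rng.normal(size=(84, 4))
print(np.round(vif(X), 2))
```

Values below the commonly cited thresholds (2.0 here, following Vaske, 2008) indicate that no predictor is close to a linear combination of the others.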

For the open-ended questions, a total of 1,507 words across 1,302 sentences were extracted and used for the text-mining analysis. The words most frequently mentioned by students in the pre-term survey included "anxiety," "friends" and "student organization," while those in the post-term survey included "issues," "solve," "discipline" and "knowledge" (Table 3). Answers to the open-ended questions in the pre-term survey included "I have anxiety regarding if I can make friends" and "I have anxiety related to if I can get along with new friends in the university." Conversely, fewer answers mentioned "friends" in the post-term survey; instead, the actual sentences included "I first worried if I can make friends but anxiety disappeared now." While many students wrote "I do not know what policy science means" in the pre-term survey, many later elaborated on the discipline and described policy science eloquently; actual sentences in the post-term survey included "(I now understand what policy science means.) It is an interdisciplinary approach that aims to gain various perspectives to solve current social issues" and "In policy science, we utilize various types of academic knowledge and suggest better policies."

Discussion

Course contents and educational effects

Comparison of students' perceptions before, during and at the end of the first-year experience course revealed that students' understanding of policy science and their skills in group discussion, report writing and presentation significantly improved, while their worries about student life at the university significantly decreased. However, items related to students' excitement about their university life and their clarity about what they wanted to study did not change significantly from the pre-term to the post-term survey in any of the three years.

These findings indicate that, although the students differed every year, the same faculty member teaching the same contents generated similar educational effects in different groups. While a previous study that used a similar three-survey design in one semester showed that students' self-efficacy decreased from the mid-term to the post-term survey (Uemoto and Ito, 2016), the present study shows that students' self-efficacy regarding learning skills generally increased over the semester. This could be because of the course contents: the basics of policy science along with general presentation and writing skills were taught in this study, while the previous study involved a more specialized course in educational psychology, including data analysis (Uemoto and Ito, 2016); therefore, respondents in this study might have found the class more comprehensible than those in the study by Uemoto and Ito (2016). Additionally, the methods used in the class, mostly based on an active-learning style (e.g. weekly discussions and class presentations), could have contributed to the increase in self-efficacy (van Dinther et al., 2011). The findings also support previous studies on the theoretical foundations of self-efficacy; furthermore, they corroborate other results showing that providing an environment in which students receive teacher and peer feedback can increase their efficacy (Krasny, 2020), suggesting that such a learning style has potentially universal effects. The survey instruments' reliability and validity, and the learning and perceptual outcomes this course can generate, are thus established to a significant extent. The fact that the two items related to presentation skills did not show a significant increase in 2016 may be explained by those students already understanding presentation skills at entry, as revealed by their pre-term survey results. In addition, the scores for this item in the mid-term (5.00) and post-term surveys (5.00) were identical. Hence, although similar educational effects may be expected with different students learning from the same teacher, other factors, such as students' understanding level at the beginning of the course, may influence the course's relative effectiveness.

Factors affecting students' vision of what they wanted to study

The stepwise multiple regression analysis revealed that those who (1) knew what policy science was, (2) had the confidence to write their opinions in a report properly and (3) could state their opinions during group work were more likely to have a clear vision of their planned course of study. This implies that instructors could carefully teach students what policy science is, including its concepts and history, to support students' plans for their college courses. Additionally, by encouraging students to voice their opinions in group work and to write reports, students' visions of what they want to study may be enhanced. Conversely, students who already had a clear vision may have been able to write better reports (as they had clear goals) and to state their opinions more clearly and confidently (as they knew their plans); therefore, the direction of influence remains unclear. An additional regression analysis, although not the main focus of this research, showed a significant relationship between "how clearly students regard what they want to study" (independent variable) and both "whether students felt they could write reports properly" and "whether students could express their opinions in groups."

While the stepwise multiple regression analysis revealed which factors could affect students' envisioning of their study plans, it is of note that students' overall level of clarity about their future study did not change significantly in any of the three years. Perhaps the course should include more policy science case studies for students to work on, to increase their interest in and incentive for studying the topic (Erikson et al., 2016). Furthermore, more interaction among students, faculty and university staff is necessary (Baik et al., 2019) to ensure that students clearly envision their university study plans.

Female students were less likely to have a clear vision about their study plans. A study showed that the first-year seminar had less effect on female than on male students regarding retention rate (Culver and Bowman, 2020). The reason for this is beyond the scope of this study. However, this study does provide potential suggestions for future instructors to consider gender differences when designing courses and for researchers to consider studying specific cognitive factors related to gender that could affect individuals' perceptions toward future study.

Changes in students' mindset from text-mining analysis

A text-mining analysis of this study's open-ended questions revealed changes in students' mindsets during the course. At the beginning of the semester, most students were concerned about campus life, including whether they could make friends (extracted words such as "anxiety" and "friends"); in the post-term survey, they were more likely to write about class content (extracted words such as "issues" and "solve," which are keywords of policy science). These patterns were also visible in the answers provided by students; some examples related to how they could describe policy science's contents and aims, implying that students attained some important course learning outcomes. Wilcox et al. (2005) showed that making compatible friends is a critical factor for first-year students, affecting college life as well as the retention rate; therefore, the finding that students' anxiety related to friends decreased (as also shown by the Likert-scale item regarding "worry") is encouraging. Students seemed to attend more to what they had learned in the post-term survey, implying that they were ready to focus on their university learning.

Limitations and potential future research

This study had some limitations. Students' self-efficacy and confidence in writing reports, giving presentations and speaking in group work were measured, whereas their grades in the course were not (the relationship between grades and survey results was beyond the scope of this research). A previous study showed that self-efficacy directly affects first-year students' academic performance; those with high self-efficacy had high academic performance (Chemers et al., 2001). Therefore, this study's finding of high confidence in study skills could potentially correspond to students' academic performance. Providing opportunities for students to reflect on their learning could help them develop their identity as learners (Erikson et al., 2016). Hence, in this course, the author encouraged students to answer a questionnaire during the semester, asking them to engage with the survey as a learning activity. Answering such questionnaires from the beginning of the course could have helped students understand the learning expectations for the course (e.g. student skills). Future research should examine students' course achievement (such as grades) to understand the relationship between their self-efficacy and achievement. Additionally, more detailed analyses (e.g. path analysis, structural equation modeling) could examine potential mediating and/or moderating variables that affect students' learning skills.

Conclusion

One of the important goals of course evaluation is to improve the quality of education and teaching in the course, as well as in the institute as a whole. However, the quality of teaching also depends on the personal style of the teacher, including how they teach and talk (Marsh, 1984; Erikson et al., 2016). Therefore, surveys similar to the one conducted in this study should also be conducted in other first-year experience courses. There are 12 classes of the first-year experience course at the College of Policy Science, and the results might differ in the other 11 classes taught by other instructors. Based on the data, a more detailed study could be conducted on which skills increase after which class (week) and how long students' confidence and learning skills can be maintained. Instructors could use those results to confirm whether the expected learning outcomes are achieved and, if not, revise and adjust their course content even during the semester. The author acknowledges that the survey itself does not fully explain the whole learning process that students experience, and further detailed study is required. The findings of this study and the process of obtaining the data could be used by the college as well as the university to guide further evaluation. For example, by using the questionnaire format of this survey (changing the target words to the contents of the course being taught) and conducting a series of surveys over a semester, the university could monitor students' progress until graduation.

To conclude, the importance of understanding students' achievements in learning at universities has been emphasized across the world; however, regarding the effectiveness of the first-year experience course, only a limited number of studies have been conducted in Japan and other Asian countries. This study, based on a series of surveys conducted over three years on different cohorts of freshmen, revealed that similar educational effects are generated across different years. The research approach utilized in this study can be applied in future research to enhance studies in this area internationally.

Contents of the course and assignment

Orientation: April 4th [first survey conducted]
Contents: Self-introduction by the teacher, ES (educational support senior students) and students
Assignment: Find newspaper articles that you think are about "policy" and summarize supporting and opposing viewpoints of that policy

First class: April 13th
Contents:
• Presentation by ES on "what policy is"
• Lecture by the teacher on policy
• Group work: think about certain policies, including their goals and approaches
Assignment: Select one policy you are interested in and summarize, on a piece of paper, the status of its implementation

Second class: April 20th
Contents:
• Presentation by ES on "designing, implementing, and evaluating policies"
• Lecture by the teacher on policy design and critical thinking
• Group work: think about ways in which we can evaluate the actual policies on the ground
Assignment: Think about how you would like to spend your 4-year university life (1. courses to take, 2. contents to study, 3. off-campus activities to experience, 4. topic you want to work on for your graduation research) to present in two minutes

Third class: April 27th
Contents:
• Presentation by ES on "fields of policy science", with an explanation by the teacher on this topic
• Presentation by ES on the research they are working on
• Presentation by students on how they would like to spend their 4 years of university life
Assignment: None

Fourth class: May 11th
Contents:
• Presentation by ES on "group work", with an explanation by the teacher on this topic
• Group work: deciding class leader and sub-leader/defining policy science in a group
Assignment: Read one paper from a list of four academic papers on policy, and summarize the research goal, methods, conclusion and significance of the study

Fifth class: May 18th
Contents:
• Presentation by ES on "academic reading"
• Lecture by the teacher on how to read academic papers
• Group work: present the papers read
• Decide and present the most well-written report in the group
Assignment: Find a research paper you are interested in and summarize the contents and weaknesses of the research in a paper

Sixth class: May 25th
Contents:
• Presentation by ES on academic writing
• Lecture by the teacher on how to write reports: part 1
• Writing exercise: revise the homework other students submitted
• Group work: sharing homework submitted and parts revised by other students
• Presentation from each group regarding what they shared
Assignment:
• Think about what you would write in the report assignment^a (to be submitted by June 15th)
• Summarize the introduction of your report assignment, its list of contents and references in a paper

Seventh class: June 1st [second survey conducted]
Contents:
• Lecture by the teacher on how to write reports: part 2
• Writing exercise: students revise the homework that other students submitted
• Group work: sharing homework submitted and parts revised by other students
• Presentation from each group regarding what they shared
• Presentation by ES on what a presentation is
Assignment: Make a three-minute presentation regarding what you wrote in the report assignment (1. Title, 2. Introduction, 3. Flow of the manuscript, 4. Conclusion, 5. References)

Eighth class: June 6th
Contents:
• Lecture by the teacher on how to write reports: part 3
• Presentation by students regarding the report assignment
Assignment: No homework

Ninth class: June 8th
Contents:
• Presentation by students regarding the report assignment
Assignment: Submit a draft of the report assignment to ES

Tenth class: June 15th
Contents:
• Presentation by ES on what debate is
• Lecture by the teacher regarding debate, workshop and facilitation/additional lecture regarding the theme of the debate (wildlife issues in Japan)
• Deciding the role each student would play in a role-playing workshop
Assignment:
• Read literature related to the role you would play in the workshop
• Summarize the features of the person you would play in the role-playing workshop in a paper

Eleventh class: June 22nd
Contents:
• Group work (share the features of the role you would play in the workshop)
• Explanation by ES on the contents and flow of the debate to be held in the 13th class
Assignment: Summarize what you would advocate in the workshop (based on the role you would play) in a paper

Twelfth class: June 29th
Contents:
• Workshop (aimed at reaching consensus among stakeholders, including farmers, government officials and hunters, played by students, to solve human-monkey problems)
Assignment: Collect information and prepare for the debate implemented in the following week

Thirteenth class: July 6th [third survey conducted]
Contents:
• Debate: supporters and opponents discuss three topics (e.g. immigration policy)

Note(s): ^a In the report, students were assigned to write about the types of research conducted and policies implemented related to one social problem that they chose

Student scores for pre-term/mid-term/post-term course surveys over the three years and results of post hoc comparison (e.g. “Pr < Mi/Po” means that the post-term and mid-term scores were significantly higher than their pre-term scores while there was no significant difference between mid-term and post-term scores) (2015: n = 29, 2016: n = 24–29 [n = 29 for the pre-term survey, n = 27 for the mid-term survey, n = 24 for the post-term survey], 2017: n = 31)

Items, by year* (mean scores: Pre-term / Mid-term / Post-term; multiple comparison)

Q1. I have good knowledge of policy science
  a: 3.03 / 4.14 / 5.03 (Pr < Mi < Po)
  b: 3.28 / 4.74 / 5.08 (Pr < Mi/Po)
  c: 3.65 / 4.48 / 4.94 (Pr < Mi/Po)

Q2. I know what group work/group discussions are
  a: 4.10 / 5.21 / 5.45 (Pr < Mi/Po)
  b: 4.24 / 5.19 / 5.25 (Pr < Mi/Po)
  c: 4.32 / 5.00 / 5.39 (Pr < Po)

Q3. I can state my own opinion in group work or group discussions
  a: 4.21 / 4.86 / 5.10 (Pr < Po)
  b: 4.28 / 5.11 / 5.46 (Pr < Mi/Po)
  c: 4.29 / 4.84 / 5.13 (Pr < Po)

Q4. I know what a report or paper is
  a: 3.66 / 4.55 / 5.21 (Pr < Mi < Po)
  b: 3.55 / 4.89 / 4.96 (Pr < Mi/Po)
  c: 3.42 / 5.10 / 5.45 (Pr < Mi/Po)

Q5. I am confident in my ability to write my opinions in reports properly
  a: 3.34 / 4.14 / 4.72 (Pr < Mi/Po)
  b: 3.72 / 4.67 / 4.63 (Pr < Po)
  c: 3.94 / 4.52 / 5.03 (Pr < Po)

Q6. I know what a presentation is
  a: 4.07 / 4.72 / 5.34 (Pr/Mi < Po)
  b: 4.34 / 5.00 / 5.00 (Pr/Mi/Po)
  c: 3.97 / 5.03 / 5.48 (Pr < Mi/Po)

Q7. I am confident about making a presentation in front of people
  a: 3.45 / 3.72 / 4.59 (Pr < Po)
  b: 3.61 / 4.15 / 4.46 (Pr/Mi/Po)
  c: 3.71 / 4.26 / 4.81 (Pr < Po)

Q8. I worry about my student life at the university
  a: 5.38 / 3.66 / 3.83 (Pr > Mi/Po)
  b: 5.00 / 3.15 / 3.58 (Pr > Mi/Po)
  c: 5.00 / 3.61 / 3.45 (Pr > Mi/Po)

Q9. I am excited about the student life at the university
  a: 5.76 / 5.79 / 5.90 (Pr/Mi/Po)
  b: 5.21 / 5.33 / 5.67 (Pr/Mi/Po)
  c: 5.45 / 4.97 / 5.03 (Pr/Mi/Po)

Q10. I am clear about what I want to study at the College of Policy Science
  a: 3.93 / 4.10 / 4.62 (Pr/Mi/Po)
  b: 4.10 / 4.78 / 4.96 (Pr/Mi/Po)
  c: 4.16 / 4.65 / 4.65 (Pr/Mi/Po)

Note(s): *a = 2015, b = 2016, c = 2017
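The "Pr < Mi < Po" style comparisons in the table above can in principle be reproduced with pairwise post hoc tests on the same students' scores across the three survey waves. The sketch below uses paired t-tests with a Bonferroni correction; the cohort values are synthetic and invented for illustration, and this section does not specify which post hoc procedure the study actually used, so treat the choice of test as an assumption.

```python
from itertools import combinations
import random

from scipy import stats

def pairwise_posthoc(scores, alpha=0.05):
    """Bonferroni-corrected paired t-tests between survey waves.

    scores: dict mapping wave label (e.g. 'Pr', 'Mi', 'Po') to the
    same students' scores in that wave, in the same student order.
    """
    pairs = list(combinations(scores, 2))
    results = {}
    for a, b in pairs:
        t, p = stats.ttest_rel(scores[a], scores[b])
        p_adj = min(1.0, p * len(pairs))  # Bonferroni correction
        results[(a, b)] = {"t": t, "p_adj": p_adj,
                           "significant": p_adj < alpha}
    return results

# Synthetic cohort of 29 students whose scores rise over the semester
random.seed(0)
pre = [random.gauss(3.0, 0.8) for _ in range(29)]
mid = [s + random.gauss(1.1, 0.5) for s in pre]
post = [s + random.gauss(0.9, 0.5) for s in mid]

result = pairwise_posthoc({"Pr": pre, "Mi": mid, "Po": post})
```

A pattern in which Pr differs from both Mi and Po, but Mi and Po do not differ, would correspond to the table's "Pr < Mi/Po" notation.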

Most frequently mentioned words from the pre-term and post-term surveys over three years

Ranking: Pre-term word (Frequency, Jaccard) | Post-term word (Frequency, Jaccard)

1. Anxiety (46, 0.438) | Issues (60, 0.417)
2. Friends (29, 0.322) | Solve (48, 0.361)
3. Know (30, 0.313) | Discipline (36, 0.336)
4. Policy (32, 0.283) | Variety of (27, 0.270)
5. Society (33, 0.277) | Fast (24, 0.264)
6. Learn (30, 0.259) | Field (25, 0.238)
7. Student organization (24, 0.258) | Think (25, 0.223)
8. Class (26, 0.248) | Various perspectives (19, 0.204)
9. Interest (24, 0.238) | Not really (19, 0.200)
10. Think (26, 0.232) | Credits (20, 0.196)
11. Me (26, 0.230) | Perspectives (16, 0.174)
12. Politics (21, 0.204) | Views (15, 0.163)
13. Various (20, 0.200) | Fun (14, 0.151)
14. Study (19, 0.194) | Time (13, 0.143)
15. College (19, 0.192) | Knowledge (13, 0.133)
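The Jaccard coefficients reported above were produced with KH Coder (Higuchi, 2021). The measure itself is simply the size of the intersection of two sets divided by the size of their union; the minimal sketch below applies it to a hypothetical mini-corpus in which the respondent IDs are invented for illustration and do not come from the study.

```python
def jaccard(a, b):
    """Jaccard coefficient |A ∩ B| / |A ∪ B| of two sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical mini-corpus: IDs of respondents whose free-text answer
# contained the word, and IDs of respondents in the pre-term survey
mentioned_anxiety = {1, 2, 3, 5, 8}
pre_term_respondents = {1, 2, 3, 4, 5, 6, 7, 8}

score = jaccard(mentioned_anxiety, pre_term_respondents)  # 5 / 8 = 0.625
```

A coefficient near 1 would mean a word appears in almost exactly the responses of that survey wave and nowhere else, which is why it is a useful measure of how characteristic a word is of the pre-term versus post-term answers.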

References

Anaya, G. (1999), “College impact on student learning: comparing the use of self-reported gains, standardized test scores, and college grades”, Research in Higher Education, Vol. 40 No. 5, pp. 499-526, doi: 10.1023/A:1018744326915.

Association of First Year Experience Course (2014), Current Situation and Future of First Year Experience Course, Sekaishisosha, Kyoto (in Japanese).

Ayyildiz, Y. and Tarhan, L. (2015), “Development of the self-directed learning skills scale”, International Journal of Lifelong Education, Vol. 34 No. 6, pp. 663-679, doi: 10.1080/02601370.2015.1091393.

Baik, C., Naylor, R., Arkoudis, S. and Dabrowski, A. (2019), “Examining the experiences of first-year students with low tertiary admission scores in Australian universities”, Studies in Higher Education, Vol. 44 No. 3, pp. 526-538, doi: 10.1080/03075079.2017.1383376.

Brinkmann, S. and Kvale, S. (2015), InterViews: Learning the Craft of Qualitative Research Interviewing, 3rd ed., Sage, Thousand Oaks, CA.

Chemers, M.M., Hu, L. and Garcia, B.F. (2001), “Academic self-efficacy and first-year college student performance and adjustment”, Journal of Educational Psychology, Vol. 93 No. 1, pp. 55-64, doi: 10.1037/0022-0663.93.1.55.

Culver, K.C. and Bowman, N.A. (2020), “Is what glitters really gold? A quasi-experimental study of first-year seminars and college student success”, Research in Higher Education, Vol. 61 No. 2, pp. 167-196, doi: 10.1007/s11162-019-09558-8.

Ding, F. and Curtis, F. (2020), “‘I feel lost and somehow messy’: a narrative inquiry into the identity struggle of a first-year university student”, Higher Education Research and Development, pp. 1-15, doi: 10.1080/07294360.2020.1804333.

Erikson, M., Erikson, M.G. and Punzi, E. (2016), “Student responses to a reflexive course evaluation”, Reflective Practice, Vol. 17 No. 6, pp. 663-675, doi: 10.1080/14623943.2016.1206877.

Freeman, R. and Dobbins, K. (2013), “Are we serious about enhancing courses? Using the principles of assessment for learning to enhance course evaluation”, Assessment and Evaluation in Higher Education, Vol. 38 No. 2, pp. 142-151, doi: 10.1080/02602938.2011.611589.

Fujiki, K., Hamana, A., Hayashi, T., Mochizuki, M. and Ozeki, S. (2020), “Visualizing learning outcomes and their future: fostering a culture of assessment”, Journal of Japan Association for College and University Education, Vol. 41 No. 2, pp. 71-75, (in Japanese with English abstract).

Hamana, A., Kawashima, T., Yamada, R. and Ogasawara, M. (2013), 30 Keywords for Making University Reform Successful: In Order to Survive in ‘University's Winter Time’, Gakuji Shuppan Kabushiki Gaisha, Tokyo, (in Japanese).

Hatano, K., Uegaki, Y. and Takahashi, T. (2015), “Is active learning related to learning outcomes?: focusing on the relationship between the change in active learning and learning outcomes for three-years in undergraduate program”, Journal of Japan Association for College and University Education, Vol. 37 No. 1, pp. 86-94, (in Japanese with English abstract).

Higuchi, K. (2021), “KH coder”, (in Japanese), available at: https://khcoder.net/ (accessed 21 January 2021).

Kawanabe, T., Kasahara, K. and Torii, T. (2013), “Development of student survey methods for institutional research: a study on the process of academic performance change by the combined use of quantitative and qualitative approaches”, Ritsumeikan Higher Educational Studies, Vol. 13, pp. 61-74, (in Japanese with English abstract).

Krasny, M.E. (2020), Advancing Environmental Education Practice, Cornell University Press, Ithaca, NY.

Marsh, H.W. (1984), “Students' evaluations of university teaching: dimensionality, reliability, validity, potential biases, and utility”, Journal of Educational Psychology, Vol. 76 No. 5, pp. 707-754, doi: 10.1037/0022-0663.76.5.707.

Matsushita, K. (2012), “Assessment of the quality of learning through performance assessment: based on the analysis of types of learning assessment”, Kyoto University Researches in Higher Education, Vol. 18, pp. 75-114, (in Japanese with English abstract).

Matsushita, K. (2020), “Combining course- and program-level outcomes assessments: pivotal embedded performance assessment's theory and its challenges”, Journal of Japan Association for College and University Education, Vol. 42 No. 1, pp. 77-81, (in Japanese with English abstract).

Mori, T. and Yamada, T. (2009), “Effects of collaborative education in the first year experience, and its process: scaffolding between students”, Kyoto University Researches in Higher Education, Vol. 15, pp. 37-46, (in Japanese with English abstract).

Parker, J.D.A., Summerfeldt, L.J., Hogan, M.J. and Majeski, S.A. (2004), “Emotional intelligence and academic success: examining the transition from high school to university”, Personality and Individual Differences, Vol. 36, pp. 163-172, doi: 10.1016/s0191-8869(03)00076-x.

Sakurai, R. (2017), “Educational effects of a mandatory first year experience course: what do students at college of policy science learn in the introductory seminar?”, Ritsumeikan Higher Educational Studies, Vol. 17, pp. 151-164 (in Japanese with English abstract).

Sakurai, R., Tsunoda, H., Enari, H., Siemer, W.F., Uehara, T. and Stedman, R.C. (2020), “Factors affecting attitudes toward reintroduction of wolves in Japan”, Global Ecology and Conservation, Vol. 22, doi: 10.1016/j.gecco.2020.e01036.

Tekkol, I.A. and Demirel, M. (2018), “An investigation of self-directed learning skills of undergraduate students”, Frontiers in Psychology, Vol. 9, p. 2324, doi: 10.3389/fpsyg.2018.02324.

Uemoto, T. and Ito, T. (2016), “The relationship between self-efficacy, intrinsic value and emotional engagement: using a cross-lagged panel model”, Educational Technology Review, Vol. 39 No. 1, pp. 145-154.

van Dinther, M., Dochy, F. and Segers, M. (2011), “Factors affecting students’ self-efficacy in higher education”, Educational Research Review, Vol. 6, pp. 95-108, doi: 10.1016/j.edurev.2010.10.003.

Vaske, J.J. (2008), Survey Research and Analysis: Applications in Parks, Recreation, and Human Dimensions, Venture Publishing, Shreveport, LA.

Virtanen, P. and Nevgi, A. (2010), “Disciplinary and gender differences among higher education students in self-regulated learning strategies”, Educational Psychology, Vol. 30 No. 3, pp. 323-347, doi: 10.1080/01443411003606391.

Wilcox, P., Winn, S. and Fyvie-Gauld, M. (2005), “‘It was nothing to do with the university, it was just the people’: the role of social support in the first-year experience of higher education”, Studies in Higher Education, Vol. 30 No. 6, pp. 707-722, doi: 10.1080/03075070500340036.

Yamada, R. (2012), Towards Guaranteeing Quality of Undergraduate School Education: Facts Revealed from Student Survey and First Year Experience, Toshindo, Tokyo, (in Japanese).

Yamada, R. (2016), “Correlation between direct and indirect assessment of general education: research results and challenges”, Journal of Japan Association for College and University Education, Vol. 38 No. 1, pp. 42-48, (in Japanese).

Acknowledgements

Declaration of interest: No potential conflict of interest was reported by the author.

Corresponding author

Ryo Sakurai can be contacted at: ryo223sak@gmail.com
