Assessment Effectiveness and Anxiety: Students’ Perception

Traditional methods of assessing university students' speaking and writing abilities, especially those of students in creative design fields, can be perceived as both impractical and monotonous. This study examines college students' perception of the effectiveness of the tools currently used to assess them, whether through authentic assessment or standardized testing, and whether anxiety plays any role in their performance. Twenty-one graphic design students at a private university in Madrid taking a course in advanced English for Specific Purposes (ESP) completed the survey. The survey, mostly qualitative, asked students to evaluate how effective the different forms of authentic assessment were, in both speaking and writing, compared to the standardized tests on which they were currently and mainly evaluated. The results show that students, in general, deemed the various forms of authentic assessment more effective, albeit not significantly so. Similarly, there was no clear difference between the anxiety levels produced by authentic assessment and those produced by standardized and classical formative assessment. Not surprisingly, however, most students preferred the use of social media platforms such as Instagram as a form of writing assessment, even though they did not consider it valid. Hopefully, this paper will encourage syllabus designers and material developers to consider students' perceptions and preferences in the assessment process while keeping in mind what their chosen fields will expect of them once they become professionals, since current trends and attitudes on assessment should be more in line with the industry.


Introduction
In Spain, only a handful of colleges and universities offer a degree in Graphic Design, Interior or Product Design, or Video Game Design. An even smaller number of these schools offer specialization subjects or subject-matter content taught in English. The survey that serves as the basis for this paper focuses on feedback obtained from students attending one of the private universities specialized in design in Madrid.
The university's English curriculum spans only two years: first-year students attend fifteen 2-hour weekly sessions of intermediate English, while second-year students attend the same 2-hour weekly sessions of advanced English for Specific Purposes (ESP) throughout the school term. During their junior and senior years, there are very few school-programmed opportunities for students to practice or be exposed to English, except during planned student exchanges, conferences and internships. This lack of exposure has been exacerbated by the Covid-19 pandemic.
The university syllabus for English for Design is one whose content the academic department decides at the beginning of the year, and which is mostly structural and synthetic (Long & Crookes, 1992). The starting point for the structural syllabus is grammar, which has to be covered discretely and sequentially each week, followed by general and specific vocabulary, with the final attention focused on assessment. This is done to enforce uniformity among teachers teaching the same levels. Student evaluation is heavily weighted in favor of a midterm and a final exam, which together represent 50 percent of the overall grade; 40 percent corresponds to traditional tasks and 10 percent to the teacher's assessment. In fact, the process of assessing English as a Foreign Language (EFL) has resulted in a series of traditional standardized tests, which have been outsourced to external bodies alien to Spanish culture and practices (Porras, 2013).
Furthermore, owing to a cultural practice that is, thankfully, quickly disappearing, students are penalized for 'guessing the answer' or leaving a blank space in multiple-choice questions on standardized tests, which some studies have pointed out could lead to test anxiety (Traub & Hambleton, 1972). As a result, Spanish educational practitioners across the board have begun to question what educational value these assessments have, how and when to incorporate assessment into instructional planning, and whether these assessments are authentic, valid and reliable enough (San Roman, 2016).
The main aim of this paper is to show college students' perception of the effectiveness of different assessment tools. A secondary aim is to determine whether anxiety plays any role in student performance. It is deemed necessary to provide a discussion tool that will drive change among the school's curriculum, syllabus and material designers in a manner that clearly articulates the school's philosophy to the benefit of its design students and future professionals. The proposed syllabus for sophomore students of English for Specific Purposes (ESP) should be one in which clearly outlined objectives, authentic forms of assessment, and the students themselves are the guiding principles.
In higher education, students should be required to use and develop higher-order thinking skills (Bloom, 1956). Authentic forms of assessment are normally more effective in evaluating students' capacity to create, evaluate and analyze, whereas standard and traditional forms of assessment only require students to show that they understand or can reproduce something from memory (Gulikers et al., 2004). To that end, a syllabus for advanced English should facilitate the learning of the skills and competences students will actually need once they enter the labor market.
In fact, there should be alignment between what students learn throughout their four years in college and the skills they will need to have acquired upon graduation. However, students, and even some educators, worry that there is a lack of proper authentic assessment to help students become consummate professionals with good language skills. Moreover, curricula and syllabi for higher education should be designed to make students more employable, considering what the labor market expects. Therefore, if college students understand the assessment process and participate in its design, their performance is likely to increase, test anxiety should decrease, and the whole learning process becomes more attractive. To that end, the survey conducted on this particular set of second-year design students aims to start a discussion around the assessment process that puts student participation in the forefront: their perception of how effective the different assessment tools used throughout the 2018-2019 school year have been, and how much anxiety they associate with each type of authentic assessment and/or standardized test, in order to reduce it.
Before outlining the different types of assessment tools used in the 2018-2019 academic year, both authentic assessment and traditional forms of assessment will be defined following what several authors have written about their characteristics. Overall, authentic assessment should be an inseparable component of teaching (Sarisu, 2018), while Gulikers, Bastiaens and Kirschner (2004) state that authentic assessment is not contrived and requires higher-order thinking skills. They go on to list some of the main characteristics of authentic tasks, which include student perception of the tasks as authentic (although the student's perception of authenticity may differ from the teacher's); elements of self- and peer-assessment; and, not surprisingly, client assessment and assessment of realistic projects. We interpret client assessment as the recruiting teams who will evaluate and test students' skills and abilities upon entering their first job or internship.
Throughout the school year, despite not being included in the syllabus, several forms of authentic assessment were introduced alongside the traditional ones established by the English department. Students participated in debates regarding the role of inclusivity and sustainability in their field. They had to prepare and deliver a Pecha Kucha on how they saw their respective fields evolving in the future. Self-assessment and peer-to-peer feedback were also introduced to give students a feel for how to give constructive criticism, and they finally started building their digital portfolios. These forms of authentic assessment required students to combine the knowledge, skills and attitudes they need in their professional lives (Gulikers et al., 2004). In the same vein, Mueller (2006) suggests that, by being assessed with authentic tools, students will be performing the very tasks they will later be expected to replicate with proficiency.
According to Ali, Zeraatpishe and Faravani (2019), another quality of authentic assessment is that it takes into account students' individual learning styles, and there is a correlation between their learning styles and their preferred form of assessment. Typically, however, teachers do not develop their assessment tools with the different learning styles in mind and mainly focus on classical assessment tools.
New traditional forms of assessment were also introduced, such as technology-enhanced quizzes and graphic organizers. Although students found them exciting and enjoyable, it is true, as Gulikers, Bastiaens and Kirschner (2004) point out, that traditional assessment mostly draws on lower-order thinking skills such as recall and recognition and is not completely structured around students' needs.
Regarding students' assessment preferences and their perception of each tool's effectiveness and the anxiety derived from it, Sarusi and Buyukkarci (2018) conducted a qualitative and quantitative study among Turkish students that found a relationship between test-taking and test anxiety. They zeroed in on the impact anxiety has on student performance and found that it is a personal phenomenon that influences performance and personal development negatively.
Focusing on students' satisfaction, or lack thereof, regarding formative assessment, some studies suggest that students perceive it as an ever-present means of 'judging' them rather than of enabling learning (Hirst, 2016). This resonates with the feedback received in the survey performed here.

Methods
The students who participated in the survey were mostly Spanish-speaking undergraduates, the majority from Spain, some from Venezuela and Peru, and a few from Italy and Morocco. The small sample of participants included only students in their second year of a degree in Graphic Design. The school did not allow the collection of statistics about students' ethnicity, gender and income level. Even so, it can be reported that the university's student body is multi-ethnic and diverse. In addition, considering that tuition for one year is more than nine thousand euros, it can be concluded that the students' standard of living is above average. In fact, more than 75 percent of the students have travelled abroad, have lived in an English-speaking country at some point, or have obtained a First Certificate in English.
Through a Google Forms survey, students were asked to evaluate the effectiveness of both authentic and traditional forms of assessment. At the top of each section, instructions for completion and the definitions of authentic assessment and standardized tests were provided. The purpose of the survey, to help the English department design a more effective learning experience, was also stated in general terms. Next to the columns with the different degrees of effectiveness, students also had to mark with an X whether they felt that the particular type of assessment was a source of anxiety. Finally, at the end of the survey, students had to choose how frequently they thought assessments should be carried out throughout the year.

Procedure
Since only this group of students was accessible, the survey was handed out during class time on the first Thursday of May 2019. The first group to complete the survey was the Graphic Design students; of the 31 students enrolled, only 21 were in attendance. To begin with, since students had just finished delivering a Pecha Kucha critiquing several iconic posters, they were asked how effective they thought Pecha Kuchas were as a form of assessment. They were also asked whether they felt anxious while preparing or delivering their presentations, and whether they preferred more of this type of assessment over the exams they would normally take at midterm and at the end of the school year. The students' initial response was that they found Pecha Kuchas more effective, with half of the class stating that they did not feel anxious while preparing or delivering them. As for how often they should be evaluated, the general consensus was that exams should be given only twice a year; responses on how often Pecha Kuchas should be done in class were muted. Students were then told that, in order to help the English department create more effective forms of assessment in line with their professional needs, they would need to fill out a survey grading the effectiveness of the different forms of assessment used in class throughout the school year. Before students completed the survey, a description of it, as well as the definitions of authentic assessment and standardized tests, was offered.

Results and Findings
Not all students marked all the options, and some of them marked the NOT SURE option. As for the level of anxiety, some students did not mark any option at all. Figure 1 shows how Graphic Design students perceived the effectiveness of the different forms of authentic assessment: in general, they rated authentic forms of assessment in the range of very effective to effective. It was surprising, however, that a few students thought that Pecha Kuchas, self-assessment and peer-assessment were not effective.
Figure 2 shows how Graphic Design students perceived the effectiveness of the different forms of traditional assessment. Interestingly, a good number marked traditional forms of assessment as effective, with a few rating them as effective as the authentic forms. On the whole, the differences in perceived effectiveness between authentic and traditional forms of assessment were negligible. One may wonder whether this is a result of students having been assessed mostly through classical methods, which are also easier to prepare for. The slight difference in effectiveness may also be due to how routinely students take standardized tests, having internalized the process as effective. That may also be why very few answered the question about anxiety.
These results differ greatly from those of the Sarusi and Buyukkarci (2018) study, in which students mostly preferred authentic or alternative forms of assessment. According to Ali, Zeraatpishe and Faravani (2019), one explanation for this disparity could be that students do not have an adequate understanding of the forms of assessment, or simply that they prefer standardized tests because they are easier to cheat on.
Graphs 3 and 4 show how Graphic Design students perceived the anxiety derived from authentic forms of assessment. The responses were too sparse to draw any conclusion. Another reason the results were not groundbreaking could be a deficiency in the data-collection tools used. Nevertheless, this does not take away from the fact that different studies offer sufficient evidence that authentic forms of assessment are more effective. For instance, Hirst (2016) argues that Pecha Kuchas are effective as both formative and summative assessment, as well as being a good pedagogical tool, and goes on to suggest that higher education students should take an active role in developing robust authentic forms of assessment whereby they attempt to monitor and regulate their own learning. Regarding the advantages of portfolios as assessment, Chitpin (2003) argues that portfolios provide information about students' progress while allowing students with different learning styles to shine, whereas with traditional forms of assessment students cannot really express their potential or show what they can do in a real-world setting.

Finally, Figure 6 shows that overall students' marks improved considerably thanks to the use of authentic forms of assessment. Just as relevant, this finding is supported by the fact that student participation was remarkably higher in non-mandatory authentic forms of assessment than in standardized homework and assignments. Figure 7 shows the percentage of student participation in authentic forms of assessment compared to traditional tasks and homework assignments.

Conclusion
This paper set out to shed light on students' perception of the effectiveness of authentic forms of assessment in comparison to traditional methods, and on the perceived level of anxiety associated with test-taking.
That said, demonstrating that they possess and master the skills needed to work in a real-world setting has become increasingly essential to students, especially considering the demands of the labor market. Therefore, the tools used to comprehensively assess students' capabilities should be transparent and effective. In real life, students have to give pitches, make presentations and provide feedback; as a result, they should be evaluated with forms of assessment that reflect those tasks, especially students of advanced English for Specific Purposes.
Although the results were not significant enough, students' view of assessment effectiveness is reason enough to have them participate and make them accountable for their own learning. Doing so also improves morale and classroom management, as classes become more dynamic and engaging. Moreover, authentic assessment takes individual preferences and learning styles into account.
Despite the challenges of designing a syllabus in which students play an active role while still reporting to the school board, the forms of assessment should be effective in that they are current and relevant (Chitpin, 2003). Hopefully, this paper will have positive implications that drive syllabus designers and material developers to design more holistic, analytical, task-based syllabi. Overall, authentic assessment should drive the curriculum.
A more robust paper would have included a study with a qualitative analysis offering data on gender, nationality and ethnicity, as well as students' assessment preferences, in addition to interviews with open-ended questions, such as 'What is the purpose of assessment?', rather than a check-box questionnaire.