Level of Learning Assessed through Written Examinations in Social Science Courses in Tertiary Education: A Study from Bangladesh

Bentul Mawa
M. Mojammel Haque
M. Mozahar Ali

Abstract

This paper examines students’ cognitive learning outcomes as assessed through semester final written examinations in social science courses at the tertiary level. The study used content analysis to examine 125 semester final written examination papers (tests) from 52 courses of the B.Sc. Ag. Econ. (Hons.) degree program of Bangladesh Agricultural University. The study revealed that the written examination papers mostly assess the ‘remember’ and ‘understanding’ levels of learning (18% and 60%, respectively), while the ‘apply’, ‘analyze’, ‘evaluate’, and ‘create’ levels together account for only 22 percent. Year-wise, the share of the lowest-order level assessed (‘remember’) showed a slightly decreasing trend, while the other levels increased correspondingly. Across study levels (L1 to L4), a consistently increasing trend was observed only for ‘understanding’; all other levels showed no definite pattern. The study concludes that assessment occurs mainly at the lower-order levels of learning and does not progress with the level of study (L1 to L4). The existing written examination strategy is not suitable for assessing the higher-order learning needed to satisfy the ‘critical thinking and decision making’ outcome and to equip students for the current job market and a rapidly changing world. The program therefore needs to change its assessment strategy to ensure higher-order learning.
Journal of Teacher Education and Research (2019). DOI: 10.36268/JTER/1413

Article Details

How to Cite
Mawa B, Haque MM, Ali MM. Level of Learning Assessed through Written Examinations in Social Science Courses in Tertiary Education: A Study from Bangladesh. JTER [Internet]. 5 Oct. 2019 [cited 24 Apr. 2024];14(01):09-4. Available from: https://jter.in/index.php/JTER/article/view/14
Section
Research Article
