International Journal of Social Science & Economic Research

Title:
TEACHING STUDENTS TO SELF-ASSESS USING COGNITIVE STRUCTURE ANALYSIS: HELPING STUDENTS DETERMINE WHAT THEY DO AND DO NOT KNOW

Authors:
Celeste Cynkin and John Leddo

MyEdMaster, LLC

MLA 8
Cynkin, Celeste, and John Leddo. "TEACHING STUDENTS TO SELF-ASSESS USING COGNITIVE STRUCTURE ANALYSIS: HELPING STUDENTS DETERMINE WHAT THEY DO AND DO NOT KNOW." Int. j. of Social Science and Economic Research, vol. 8, no. 9, Sept. 2023, pp. 3009-3020, doi.org/10.46609/IJSSER.2023.v08i09.040. Accessed Sept. 2023.


ABSTRACT:
In previous papers, we reported an assessment technique called Cognitive Structure Analysis (CSA) that is designed to assess the knowledge people have, not just how well they can give correct answers to questions. The types of knowledge covered by CSA include facts, procedures, problem-solving strategies, and rationales (why things work the way they do). Experimental tests of CSA have shown high correlations between assessments of student knowledge and how well students perform on problem-solving tasks. The present paper explores whether students can be taught to use CSA to self-assess their knowledge of a topic they have just been taught. Sixteen students attending a gifted and talented high school were first given instruction on how to self-assess using CSA. They were then given a lesson in calculus and asked to self-assess their knowledge, after which they took a problem-solving test requiring knowledge of the topic they had just been taught. Self-assessment protocols were evaluated by assigning each item listed in a protocol to one of five categories: knowledge the student believed to be relevant but that was not actually relevant; knowledge the student knew was relevant and actually had; knowledge the student knew was relevant but knew s/he did not have; knowledge the student knew was relevant and believed s/he had but for which s/he actually gave incorrect information; and knowledge that was actually necessary to solve the problems but that the student did not mention at all. Results showed that students' self-assessments were, on average, 92% accurate, meaning that for the knowledge required for problem solving, students either recognized its relevance and gave the correct information ("knowing what you know") or recognized its relevance and acknowledged that they lacked it ("knowing what you don't know").
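The five-category scoring scheme described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' scoring procedure; all item names, contents, and the sample protocol are hypothetical, and the accuracy measure assumed here is the share of required knowledge falling into the two correctly self-assessed categories.

```python
# Hypothetical knowledge actually required to solve the test problems,
# mapped to its correct content (illustrative items only).
REQUIRED = {
    "chain rule": "d/dx f(g(x)) = f'(g(x)) * g'(x)",
    "power rule": "d/dx x^n = n * x^(n-1)",
    "derivative of sin": "d/dx sin(x) = cos(x)",
}

# A hypothetical student protocol: each listed item maps to the content the
# student stated, or None if the student flagged the item as relevant but
# admitted not knowing it.
protocol = {
    "chain rule": "d/dx f(g(x)) = f'(g(x)) * g'(x)",  # relevant, stated correctly
    "power rule": None,                               # relevant, admitted gap
    "quotient rule": "d/dx u/v = (u'v - uv')/v^2",    # not actually relevant here
    "derivative of sin": "d/dx sin(x) = sin(x)",      # believed known, but wrong
}

def categorize(required, protocol):
    """Assign each protocol item to one of the five CSA categories."""
    counts = {"irrelevant": 0, "know_you_know": 0, "know_you_dont": 0,
              "wrong_content": 0, "unmentioned": 0}
    for item, content in protocol.items():
        if item not in required:
            counts["irrelevant"] += 1          # believed relevant, is not
        elif content is None:
            counts["know_you_dont"] += 1       # knowing what you don't know
        elif content == required[item]:
            counts["know_you_know"] += 1       # knowing what you know
        else:
            counts["wrong_content"] += 1       # believed known, incorrect
    # Required knowledge the student never mentioned at all.
    counts["unmentioned"] = sum(1 for item in required if item not in protocol)
    return counts

counts = categorize(REQUIRED, protocol)
# Self-assessment accuracy: correctly self-assessed required knowledge
# (known-known plus known-unknown) over all required knowledge.
accuracy = (counts["know_you_know"] + counts["know_you_dont"]) / len(REQUIRED)
print(counts, round(accuracy, 2))
```

Under these toy inputs, two of the three required items are correctly self-assessed, so the sketch yields an accuracy of about 0.67; the study's reported 92% average would correspond to much more accurate protocols.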
Further, when the five knowledge categories were compared to the problem-solving scores, only the two categories reflecting accurate self-assessment ("knowing what you know" and "knowing what you don't know") showed statistically significant correlations. These results indicate that students can be taught to determine what knowledge they lack for effective problem solving, and suggest that any gaps in their knowledge that they cannot effectively self-assess may not impair learning.
