Sir Ken Robinson told a group of Texas educators recently that the problem with standardization in education is that people “don’t come in standard versions.” Simple. Common sense. Square peg, round hole. People are unique. Standard education doesn’t account for that fact.
So what to do?
Since it’s not likely that people will come in standard versions anytime soon, we need to change the educational environment so that it takes “proper account of the different talents of students.” Educational systems should be designed to preserve the intrinsic motivation for learning with which children are born. They should encourage the self-determination that leads to a life of purpose, productivity, and personal responsibility.
There is an opportunity at hand this year for Texas to influence system design for the better in education.
The Texas Commission on Next Generation Assessments and Accountability held its first meeting in January. Created by legislation passed by the 84th Texas Legislature and charged with studying and making recommendations for changes in the state’s testing and accountability system, the commission will meet six times this year before filing a final report in September.
What direction should the commission take?
In his latest book, Creative Schools: The Grassroots Revolution That's Transforming Education 1, Robinson makes the point that a problem with most current assessment systems is that they are too light on description to “convey the complexities of the process” they are meant to summarize. To enhance assessment, we need to expand the elements of description and the evidence upon which learning is evaluated.
Research is providing direction for this kind of work if only we can muster the will to think and act in different ways. A study published in the Journal of Educational Psychology last month, The Narrative Waltz: The Role of Flexibility in Writing Proficiency 2, could inform a discussion about next generation assessments.
Long story short… the researchers found that context matters in the evaluation of learning.
Short story, longer… having longitudinal data to make decisions about student learning is critical; specifically, in this case, in evaluating writing proficiency. The researchers were trying to understand the extent to which high school students’ linguistic flexibility across a series of essays was associated with writing proficiency. Flexibility refers to students’ ability to adapt their writing styles and use various linguistic properties according to the specific context of the writing task; properties like cohesion, rhetoric, complex syntax, explicit argument, reported speech, contrasted ideas, and narrativity.
To analyze flexibility, the researchers utilized both natural language processing techniques and dynamic methodologies to capture variability in students’ use of narrative text properties across 16 argumentative, prompt-based essays.
Why narrativity as opposed to another linguistic property?
One, they needed a property to quantify and analyze, and narrativity, as an easier writing style for high school students to use, was likely to be more prevalent across multiple writing assignments. Two, they wanted to address what they believe is a common misunderstanding about the use of narrativity in essay writing. Research has shown that texts with more narrative elements tend to be easier to read and comprehend. Narrativity, therefore, is often assumed to be a sign of essay quality. But this assumption is not supported in the literature. The majority of research on essay quality suggests the opposite: that higher-quality writing is associated with decreased levels of text narrativity.
In the current study, the degree of narrativity was assessed using a narrativity component score provided by Coh-Metrix, a computational text analysis tool that analyzes text at the word, sentence, and discourse levels. Coh-Metrix calculates the coherence of texts on many different measures. The Coh-Metrix narrativity Easability Component score, based on the results of a previous, large-scale corpus analysis, served as a measure of text readability based on the degree of story-like elements present within an individual text. To quantify students’ narrative writing patterns, the researchers applied random walks and Euclidean distances to create visualizations and classifications of students’ use of narrative properties across the 16 assigned essays.
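To make the random-walk idea concrete, here is a toy sketch. The scores and the 0.5 threshold are hypothetical (the study used actual Coh-Metrix narrativity component scores, which are not reproduced here), and the step rule is a simplification of the published methodology: each essay moves the walk up when its narrativity is above the threshold and down when below, so a consistent writer drifts steadily away from zero while a flexible writer hovers near it.

```python
import math

# Assumed threshold; in the study this role is played by
# corpus-derived Coh-Metrix narrativity component scores.
CORPUS_MEDIAN = 0.5

def narrativity_walk(scores, median=CORPUS_MEDIAN):
    """Step +1 when an essay's narrativity is above the threshold,
    -1 when below; the path shows how consistently a student
    relies on narrative style across essays."""
    walk, pos = [0], 0
    for s in scores:
        pos += 1 if s > median else -1
        walk.append(pos)
    return walk

def euclidean_distance(a, b):
    """Point-by-point distance between two walks; distances like
    this can group students with similar writing patterns."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical narrativity scores for 16 essays per student.
consistent = [0.7, 0.8, 0.7, 0.6, 0.7, 0.8, 0.7, 0.7,
              0.6, 0.7, 0.8, 0.7, 0.7, 0.6, 0.7, 0.8]
flexible   = [0.8, 0.3, 0.7, 0.2, 0.9, 0.4, 0.6, 0.1,
              0.8, 0.2, 0.7, 0.3, 0.9, 0.1, 0.6, 0.4]

print(narrativity_walk(consistent)[-1])  # drifts to +16
print(narrativity_walk(flexible)[-1])    # ends at 0
```

The contrast between the two final positions is the point: the walk turns a sequence of per-essay scores into a single trajectory whose shape distinguishes consistent from flexible use of a text property.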
The researchers found that it was students’ flexible use of text properties rather than the consistent use of a particular set of text properties (in this case, narrativity) that correlated with established measures of essay quality. The situational influence of narrative text elements on writing quality varies. It’s not just about frequency or consistency of use, but the writer’s ability to use appropriate techniques given the context. Flexibility informs writing proficiency.
Bottom line? Writing proficiency measures should account for a student’s sensitivity to multiple linguistic profiles, which can only be determined by evaluating multiple writing samples over a period of time.
The authors argue their findings suggest standardized test developers should “aim to develop more sophisticated assessments that can capture students' writing skills across a number of different contexts." In other words, when designing assessment systems, we need to think in terms of expanding the elements of description and the evidence upon which learning is evaluated.
The square peg and round hole are not irreconcilable. It’s a question of commitment to a process of developing “original ideas that have value.” Or what Robinson calls... creativity.
1 Robinson, K. & Aronica, L. (2015). Creative Schools: The Grassroots Revolution That's Transforming Education. New York: Penguin Publishing Group.
2 Allen, L. K., Snow, E. L., & McNamara, D. S. (2016). The Narrative Waltz: The Role of Flexibility in Writing Proficiency. Journal of Educational Psychology.