Written By: Audrey Gardner
The day after the International Symposium on Literacy as Numbers: Researching the Politics and Practices of Literacy Assessment Regimes in London, I flew back to Calgary saturated with happy thoughts from the event’s provocative conversations, presentations, and discussions about the implications of large-scale measures (international skill surveys) for the meaning of adult literacy at both the policy and practice levels. It was a long way to travel for a one-day event, but the topic was so close to my own research that I had to go. I am exploring how so-called assessment regimes, that is, measurement technologies on a global scale such as those commissioned by the Organisation for Economic Co-operation and Development (OECD), influence adult literacy policies and programming. It was definitely worth the journey.
Here in Canada there is little opportunity to engage with colleagues in constructive discussions about how government’s emphatic uptake of the OECD surveys has had an enormous impact on teaching and learning practices in programs. I am not talking about the pressures of working in an under-resourced corner of adult education, but about having a forum to really examine power relations in adult literacy and to question how literacy is increasingly valued as a workforce skill rather than as an aspect of broader and more complex daily life. What does it mean that we use statistical language (e.g., 42 per cent of Canadians are semi-illiterate) to advocate for literacy while at the same time critiquing the calculated “problem” of adult literacy in Canada? My poster presentation, “Counting Out Who Can’t Be Counted On: The New Norm in Adult Literacy,” looked at the implications of using test scores from international surveys to determine who is productive and, more disturbingly, who is unproductive.
I came home ready to dive into writing about my newly gained knowledge, but just as I recovered from jetlag, my household was evacuated. A few blocks from our house the Bow River was raging, water spurting out of street sewers and brimming over the river banks. We had to gather what we could, pets and all, and quickly move to higher ground. My suitcase was still sitting in my bedroom, barely unpacked. I quickly threw in some more clothes and my notes from the symposium. I somehow thought that I could keep my mind on all the new knowledge I had gained and find a place where I could settle into writing while waiting out the evacuation. Grumbling about how inconvenient being evacuated was (as far as I knew, the river had never flooded before), I was determined to stay focused on the amazing experience I’d had at the symposium and fold all of what I’d learned into my research.
The next morning, as I stood on the hilltop looking down at the water surrounding my house, I could feel my mind shut down. When I started thinking again, it was about wanting to be with my family and wanting to know if my neighbours and friends were okay. The musings about the symposium were gone. I was overwhelmed with sadness. After a couple of days we were able to start bailing out the watery mess inside our houses. By the sixth day of seemingly never-ending cleaning and clearing out the muck and mud, it felt like my symposium immersion had been swept away with the floodwaters. It is now six weeks since the evacuation. The Bow River has receded, contained within its newly carved-out banks, and I am once again determined to get my mind back to where it was before the flood. Writing this is my attempt to paddle back to thinking about the disturbing concept of literacy as numbers.
The symposium focused on the underlying divergent and competing beliefs about the meaning and value of literacy. Mary Hamilton talked about what happens to our understanding of literacy when it’s translated into the statistical language of international surveys that inform labour market policies, not learning assessments in classrooms and programs. She argued that:
1. Large-scale enumerative projects of literacy assessment are increasingly global in scope and affect educational policy and practice.
2. We need productive critical debate between academics, practitioners, research students, and policy-makers about the implications and the social, political, and scientific contexts within which these surveys are being carried out. We also need to really think critically about the ways in which literacy is being conceptualized as numbers.
3. It’s crucial that we apply critical policy, ethnographic, and sociological research perspectives to literacy assessment.
Much more than arguing about whether literacy is a human right or a set of human capital skills, “Literacy as Numbers” was a forum to critically question what measurement is doing to how we understand what literacy actually is.
I use the term measurement to mean institutionally arranged concepts and methods of making judgments about people’s literacy. Institutional measurement values scientific methods over individual/personal knowledge and local, cultural contexts. Institutional measurement is made up of testing practices, statistical calculations, elaborate apparatus, and technical expertise. Sotiria Grek from the University of Edinburgh said in her presentation that “Testing is important because it produces numbers and consequently ratings and rankings; once the OECD has created this unprecedented spectacle of comparison in European education, no system can remain hidden and separate any longer. The field of measurement becomes instantly the field of the game.”
Measurement of literacy in many countries, including Canada, has come to rely on high-cost, highly technical international surveys such as IALS and PIAAC over other assessment frameworks. Comparing survey results between countries, regions, and groups of people may yield valuable information, but when numbers become the dominant story of literacy, the knowledge of learning and teaching embedded in program practices becomes submerged. Learners, as knowing actors, become objects of the so-called literacy problem. It is disappointing that IALS and the upcoming PIAAC surveys represent a watershed in the adult literacy field. They have enabled the imposition of a ‘global’ but limited form of assessment. Now we describe literacy in a de-contextualized frame of literacy levels and percentages of literacy deficits. These global measures are much more than surveys; they have unquestionably reframed the meaning of literacy.
In Canada there has been limited support for thoughtful critique of the shift toward privileging these OECD measures. Perhaps that is why six of the approximately thirty participants at the symposium were from Canada. We need a Canadian forum for constructive critical dialogue on the implications of the current political fixation with these literacy-assessment regimes.
Many of the discussions at the symposium focused on how politics and large-scale international measures such as the OECD international adult literacy surveys are awkward bedfellows that spawn policies to fasten adult literacy programs to narrow and unrealistic outcomes. Submerged under the statistical language of these measures is the misfit of policy objectives with actual assessment practices in literacy programs. How relevant is the IALS statistic that claims “40 percent of Albertans aged sixteen and older struggle with their ability to understand and use information from texts”? 
During the first two intense weeks following the flood, information flowed freely among my friends and neighbours about bailing water out of soggy houses and gardens, how to prevent mould, and where to find disaster relief. After I went back to work and students started returning to the literacy programs, they talked about how they had experienced the Calgary floods. Like me, they learned what to do through engaging with others. Text was definitely part of this state of emergency, but we learned how to get and use information from multiple sources and in multiple ways. In everyday life, using information from texts usually also involves talking, listening, and doing, all of which intersect with reading, writing, and calculating. But standardized tests don’t capture such real-life scenarios; they don’t acknowledge how people get information from relationships. Without knowing the literacy levels of the hundreds of people helping each other in our waterlogged community, I witnessed over and over again that the ability to understand and use information from texts is a social practice, not an isolated act cut off from the socio-cultural world we live in. Literacy, like water, is fluid and uncontainable. To use Francis Ponge’s poem “Water” to talk about literacy, I’d add that it “eludes all definition, yet leaves its traces in my mind, on this paper – formless blots.”
 CBC, The National, “Canada’s Shame,” In-depth Education Report, May 24, 2006, retrieved July 31, 2013, from http://www.cbc.ca/news/background/education/canada-shame.html
 Sotiria Grek, “The OECD as a Site of Co-production: European Education Governance and the New Politics of ‘Policy Mobilization’,” paper presented at the International Symposium on Literacy as Numbers: Researching the Politics and Practices of Literacy Assessment Regimes, London, UK, 17 June 2013.
 The OECD commissioned three international adult literacy surveys of mostly western/industrialized countries. The two mentioned here are IALS, the International Adult Literacy Survey, conducted in the early 1990s, and PIAAC, the Programme for the International Assessment of Adult Competencies, whose results will be released in October 2013. The third is ALLS, the Adult Literacy and Life Skills Survey, completed in 2005.
 Government of Alberta, Living Literacy: A Literacy Framework for Alberta’s Next Generation Economy, 2009, 1.
 Francis Ponge, The Nature of Things, translated by Lee Fahnestock (New York: Red Dust Books, 1995).