Teacher Experiences of Portfolio Based Language Assessment Implementation in Community Based ESL and LINC Programs
Starting in 2009, community-based ESL (English as a Second Language) and LINC (Language Instruction for Newcomers to Canada) programs across Canada have been introduced to the Portfolio-Based Language Assessment (PBLA), a classroom-based, formative assessment tool developed by Joanne Pettis (2014). The tool was initially designed in response to a need for standardized assessment within the LINC program, but was later introduced by Ontario’s Ministry of Citizenship and Immigration into the provincial ESL programs. Since its introduction in these contexts, PBLA has been discussed at great length at conferences and workshops of varying sizes, and has featured in practitioner journals such as TESL Canada and TESL Ontario’s Contact magazine.
This proposal describes a study to be carried out in a program in which teachers are currently using PBLA. The program is operated by the Continuing Education department of a school board and is situated in twenty-nine locations across the city, offering 179 classes during the day, in the evening, and on weekends. The classes range in level from CLB 1 to CLB 8+, and offerings include LINC and ESL across all four skills, as well as Literacy and Basic Skills, and English for Specific Purposes courses.
The participants in the study include instructors who will be implementing and using PBLA for the first time in the 2017-2018 academic year, as well as lead instructors who are more experienced with PBLA; the newer instructors will begin the implementation process by piloting a variety of its elements in their classrooms. The primary difference between the two groups is that lead instructors serve a support function within a program using PBLA: they train the remaining instructors and support them throughout the process of full implementation, and as such are trained and begin to use PBLA in their own classrooms in advance of their program’s implementation. Within the program considered in this study, the lead instructors’ training and use of PBLA began a full year prior to full implementation. This distinction is important for two reasons: first, the differing experiences of the two groups, when each is compared with its own initial stage, could reveal either a contrast or a parallel shift in opinions; second, it allows for an exploration of factors beyond PBLA itself as explanations for positive and negative experiences.
The impetus for this research draws from the researcher’s experience as a language instructor participating in PBLA implementation, as well as from the frustrations and challenges experienced by instructors who have used, or have begun using, PBLA in their classes. As such, the primary audience for this study is practitioners (i.e. language instructors). The aim of the study is to explore the experiences of instructors who have been implementing and using PBLA in their classes, as well as those who have only begun to do so, in order to provide insight into the strengths of PBLA, the difficulties faced, and the ways some of these obstacles may have been overcome, and to offer further evidence describing the use of PBLA in classrooms from the perspective of the instructor. The study hopes to contribute to the literature suggestions for introducing and implementing PBLA and other standardizing tools in similar contexts by presenting the instructor perspective for the consideration of those involved in making such decisions.
Overview of PBLA
The PBLA guide, a document written by Joanne Pettis, provides “an overview of the theoretical foundations, principles, and assessment strategies that are fundamental to PBLA” (Pettis, 2014). It defines PBLA as “a comprehensive, systematic, authentic, and collaborative approach to language assessment that engages teachers and students in dialogue to tell the story of the student’s journey in learning English and meeting personal goals”; it is a “classroom- and teacher-based assessment approach that is integrated throughout the teaching/learning cycle” (p. 7). It defines portfolios as “collection[s] of samples of tasks or products” (p. 84). The guide describes the collected samples as artefacts, which include teacher-assessed, peer-assessed, and self-assessed tasks.
The PBLA guide also describes the pedagogical grounding of PBLA: Classroom-Based Assessment, the Canadian Language Benchmarks (CLB), and Assessment for Learning (AfL) strategies. These foundations fit with PBLA in that they all represent aspects of portfolio use in a Canadian classroom setting: the portfolio is used primarily for classroom-based assessment; it can be tailored to fit the Canadian language context; and, because it tells the story of a language learning journey, as Pettis (2014) describes, it encourages learners to take ownership of their language learning and provides a more meaningful classroom experience that focuses more on learning than on assessing.
In addition to the guide, PBLA also includes the Language Companion (LC), synonymously called the portfolio. The Language Companion is the binder that learners are given at the start of their learning experience. It contains a variety of published resources, as well as the portfolio proper, which includes a section for baseline information (including a needs assessment and autobiography), class notes, and the collection of assessment tasks across the four skills. There are three versions of the LC: Literacy, Phase I (CLB 1-4), and Phase II. Pettis (2014) argues that these portfolios are important to student learning because they help to develop autonomy and responsibility.
Models for PBLA
Pettis (2014) states that she drew on two models in developing PBLA for the classroom: the European Language Portfolio and the Collaborative Language Portfolio Assessment, used in Europe and Manitoba respectively. The guide does not describe these in detail, but it is important to consider some of their major similarities and differences.
The European Language Portfolio. Proposed in 1991, the European Language Portfolio (ELP) is a document in three parts: a language passport, in which students summarize their cultural and linguistic identity; a language biography, which includes learning targets, reflections, and self-assessments; and a dossier, which contains samples of the learner’s work. The intention of the ELP is to keep track of learning as it happens and to record achievements in language acquisition, developing language and cultural skills and facilitating educational and vocational mobility; it is underpinned by learner ownership, autonomy, and self-assessment.
By the 2006-2007 school year, 2.5 million ELPs had been distributed throughout the member states of the Council of Europe since 1991. In that same school year, roughly 584,000 were reported as being in use by learners (Schärer, 2008, p. 5). The Council has considered these numbers a success.
Research has found that the ELP positively impacts education and contributes to a deeper understanding and spread of European principles, goals, and values. It values different learning styles and methods and can be considered an effective tool for reporting learner progress. It develops plurilingual competencies in learners and transcends grading systems across organizations and states (Little, Goullier, & Hughes, 2011). It must be noted, however, that many reports of success include the qualifier “if used appropriately”. The word ‘appropriately’ is not defined across contexts, and such reports are best understood as self-declarations rather than quantitative evidence.
Research also acknowledges that the ELP has limitations. Variables between stakeholders often escape control; coherence is difficult to maintain in a changing system; and priorities may shift over time. The change and challenge to traditional learning and teaching practices has not been perceived as comfortable. There is a need to clarify the status of the ELP, and an imbalance has been identified between the resources allotted and the goals of the ELP. The competing demands of the curriculum and portfolio principles have been difficult to manage, including finding class time to make good use of the ELP. To counter some of these limitations, sustained, ongoing seminars and workshops are needed as support mechanisms for both learners and teachers.
After 15 years of implementation, Little et al. (2011, p. 5) recognize that “the adoption and implementation of the ELP has still not reached the levels hoped for when it was first launched.” With almost 600,000 ELPs distributed in the 2006-2007 school year, fewer than 100,000 learners were using them. To achieve greater success in the future, the Council intends to continue revising existing models, enhancing the registration process, and encouraging whole-school rather than class-by-class implementation. As Schärer (2008) states, “the overall level of penetration is so far not sufficient to reap the full benefits of ELP use. To prove beyond doubt that implementation is worthwhile and self-sustaining in the long term ELP projects need a certain scale” (p. 4).
The primary difference from the Canadian PBLA model is that by 2010 there were 188 validated and contextualized models of the ELP, developed by 28 member states and approved by the Council of Europe (Little et al., 2011). These models account for the ages of the learners, their cultural and language backgrounds, and their professional and life situations, resulting in variation in primary objectives. This contrasts with Canada’s three-version LC model.
Collaborative Language Portfolio Assessment. The Collaborative Language Portfolio Assessment (CLPA) system was developed by Joanne Pettis specifically for use in Manitoba, and works similarly to PBLA but on a much smaller scale. Pettis (2010) presents the CLPA as a far less complicated assessment system than PBLA, stating that “Manitoba provides students with a Manitoba CLPA divider to insert into the binders they use in their language training” (p. 47). Interestingly, the CLPA relies on a single divider, in contrast with the LCs, which comprise a full binder. The CLPA system is described as having many benefits for learners, including real-world relevance and enabling interactive and independent learning. Many of these benefits overlap with those of PBLA, including the benefits for teachers, which are repeated almost verbatim in the PBLA guide.
The CLPA portfolio takes the form of a learning portfolio, which documents work demonstrating the development of skills over time. The portfolios are described as including a section for personal information and four additional sections, one designated for each of the four skills. These sections are referenced to the CLB 2000 competencies.
Review of Literature
Theoretical Framework: Portfolios in the Classroom
This study is grounded in theory surrounding portfolio assessment and how it functions as a tool for formative and classroom-based assessment practices. Portfolio assessment as used in the field of education dates back to Flood and Lapp’s (1989) article on reading portfolios and childhood education, in which they argue that the portfolio is a way of demonstrating progress in the learning process. While not addressing language learning, Flood and Lapp’s conceptualization of portfolios as demonstrations of learning is reflected in Pettis (2014), who remarks that portfolios enable learners to “‘see’ their progress by comparing their present language competence with competence displayed on entry to the class or program” (p. 8).
Pettis’s own characterization of portfolios draws on Ali’s (2005) article describing the e-portfolio as a tool for assessment in ESL and English as a Foreign Language (EFL) classes. The use of this description of portfolio assessment does not imply that e-portfolios are Pettis’s intention; what is significant to this study is how portfolios are defined. Ali uses McGraw-Hill Higher Education’s definition of portfolios, stating that they function as objects, in being a place to collect representations of student work and progress, and as assessment, in the continuous collection and assessment of student work. She then distinguishes between formative and summative portfolios and explains that summative portfolios are further categorized as competency-based, based on negotiated learning, or biographical.
Another issue arising from the literature thus far is the degree to which the medium of portfolio creation affects the overall experience. Since Pettis (2014) refers to both Chang (2001) and Ali (2005) in the PBLA guide, despite PBLA being paper-based at the time of this writing, the guide suggests that it may be prudent to consider paper-based and electronic portfolios ontologically and functionally related. Whether this holds in a community-based ESL context remains open, however, and so the relevance of Ali’s and Chang’s articles is still a valid question.
Miholic and Moss (2001) describe the form and function of portfolio assessment and discuss problems related to using portfolios as assessment tools. They argue that portfolios have many benefits, stating that they are useful for “teaching writing and promoting revision” (p. 9). They also suggest that “many instructors who claim to use portfolios … merely ask students to assemble a folio” without engaging in learning, reflection, and assessment in a meaningful way. They conclude that what is required for a successful portfolio is “the ability and willingness to recognise what’s working or not working, to discuss needs, and reach consensus” (p. 11) – relating back to the ideas of reflexivity, goal setting, and, as Pettis (2014) puts it, engaging teachers and students in a dialogue related to learning.
Portfolios have been used in education with varying degrees of success and varying results. For instance, Chang (2001) studied a college class where web portfolios were used, with responses suggesting a generally positive opinion of the portfolio system. Chang’s study describes some of the characteristics of portfolios: they are dual-valued (offering two-way interaction between teachers and students), selective (allowing students to choose what to include), authentic, reflective, individual, and interactive. Though Chang’s study focuses on web-based portfolios, his insight is still valuable, as one element of the study examines how these web portfolios impact learning; Chang presents this data using a differential scale and an averaging of scores, and the data suggest that the overall opinion of portfolio use in his context was positive. However, one concern with this study is that its context is a university or college class, so the results are not perfectly analogous to broader contexts. The target participants were students taking a “Computer and Instruction” course in a pre-service teacher training program at “some University” (p. 442), and it is unclear whether the study was conducted at Chang’s home institution in Taiwan or elsewhere. As a study of portfolio assessment, and especially as one which Pettis mentions in the bibliography of the PBLA guide, the issue of context is worth raising.
Another element of the portfolio assessment literature, exemplified by Ali (2005) and Moya and Malley (1994), is the offering of guidelines for implementing a portfolio. Both articles take a similar approach to implementing portfolios (in Moya and Malley’s case) and e-portfolios (in Ali’s case): the initial phase of implementation is defining the aim of the portfolio. The second element described by both articles is the planning phase, which requires considering the audience and determining the content (Ali, 2005; Moya & Malley, 1994). This is an important operational step because it means understanding the part portfolios play in the context, and the extent to which participation in and active use of the portfolio is required. Both articles then outline the design of the portfolio as used for assessment purposes, including feedback and in-class assessment, where they discuss peer feedback and make suggestions for assessment periods. Specifically, Moya and Malley discuss criteria and standards in terms of data interpretation, while Ali briefly discusses peer correction and feedback mechanisms. Moya and Malley conclude their model with a requirement of evaluation and validation, suggesting methods for conducting it, including concurrent and predictive validity studies, and going into some depth about these options.
To add to this, Kohonen (2000) discusses student reflection and language learning as a visible demonstration of language competence, in the context of the ELP. He argues that the portfolio is a bridge between an instructor’s goal of fostering learner autonomy and a student’s ability to demonstrate that autonomy. This idea is also present in the PBLA guide (Pettis, 2014), and it plays a significant role for Kohonen, who states that there are specific expectations of the teacher if the process is to succeed. Specifically, he makes a definitive statement about the support a teacher needs: support from colleagues and in-service professional development throughout implementation.
Studies of PBLA Use and Implementation
Some published work on PBLA implementation addresses the piloting/field-testing phase and offers recommendations for improving the implementation process. Ripley’s (2012) study looks at the implementation of PBLA in LINC classrooms in a large Canadian city, drawing on the experiences of piloting instructors, a PBLA developer, and a representative from CIC. Ripley identifies a number of challenges surrounding PBLA implementation, including support, contextual constraints such as continuous intake and program variation, and instructor training. One of the bigger challenges is the lack of support, described by Ripley’s participant Katherine, who says that “it was necessary to take the perspectives of instructors into account because PBLA would probably require some of them to undertake additional unpaid work” (p. 79). Similar comments are expected to arise over the course of the implementation in progress in the program studied here, and may emerge in the interviews with the instructors.
Another set of issues addressed by Ripley is continuous intake and teacher training. In terms of continuous intake, Katherine notes that conducting needs assessments and distributing LCs to new students whenever they arrive is a logistical challenge, and one that poses problems for implementation if not addressed correctly. This holds for a community-based blended ESL/LINC program such as the one under consideration here, and not simply because it includes a LINC program: continuous intake also occurs within the ESL program throughout the term. Additionally, both programs experience continuous departures, which raises a second set of challenges that may emerge in this study.
On the issue of teacher training, Katherine says, “what happens when you get new teachers on staff…who haven’t had the [PBLA] training?” (p. 79). The question of teacher training is an important one – as Holmes (2015) describes in her article outlining ways to make PBLA sustainable, the team of people implementing PBLA need “regular opportunities to discuss their assessment practises and to learn from each other; professional development must be continuous, recognising that effective work requires ongoing commitment” (p.120). This study will consider, as part of the experience of PBLA implementation, the role of teacher-training, and the knowledge that teachers have of PBLA as they begin implementation.
Radivojevic’s (2014) study examines teacher responses to using the PBLA model during its field-test phase for assessing ESL learner performance. Her qualitative study, which used interviews to collect data, focuses on the community college where she was working and on the contradictions she believes are inherent to the system. She looked to the ELP for a historical overview of portfolio implementation and acknowledges that portfolio assessment is a long-term process: though the ELP model has been in use for over 20 years, it has not yet been fully implemented. The research found that moving from the old, traditional test model of assessment to the task-based assessment model of PBLA posed an immense challenge for teachers.
One of Radivojevic’s (2014) proposals was that, in order to ease teachers into the PBLA experience, the contradictions in the process and model need to be addressed. Teachers were asked 11 questions, and Radivojevic identified contradictions in implementation, acknowledging that some were resolved over the course of the field test. She also identified the learning processes the teachers used to resolve the identified contradictions. Radivojevic found 2 primary contradictions within the subject (teachers), 5 primary contradictions with the tool (PBLA), 2 secondary contradictions, and 1 tertiary contradiction. Not all of the contradictions were analyzed.
Interestingly, it was noted that some of the contradictions were able to be overcome because of the collaborative and supportive nature of the implementation. Teachers worked with each other and were supported by materials developers and program advisers, and this was supplemented by weekly meetings. This relates back to the point made by Ripley (2012) and Holmes (2015) that support is a significant factor in the implementation of PBLA. Radivojevic (2014) also concluded that ongoing teacher support is imperative for successful implementation and continued improvement of PBLA, an idea which echoes the findings of other research that looks into the implementation process of portfolio based assessments in a second language environment.
In addition to the articles above stating that support was necessary, one other recurring theme is present; that of purpose. For instance, Ripley (2012) emphasises clarifying the purpose of the Language Companion. This idea is one which resonates throughout much of modern second language instruction and education in general – being able to ground pedagogy in a purpose helps learners set realistic expectations and develop learner autonomy, features which are important to learning (Cotterall, 2000; Nation & Macalister, 2010). Ripley (2012) suggests that resources need to be given specific purpose; he says that if the resources in the LC are meant to have some use, that instructors “need specific instructions and training on how this should be done” (p. 83). This comment suggests that there is a lack of clarity for how the LCs should be used, and what their purpose is.
Fox’s (2014) study of PBLA implementation in LINC classes identified four significant trends, only one of which was consistent with PBLA’s goals. She argues that PBLA use presented: “(a) increased planning and accountability; (b) the clearer articulation of goals for activity; (c) increased awareness of assessment; and (d) increased emphasis on summative assessment” (p. 81). Fox interviewed three LINC teachers and collected data from one other, and while she acknowledges that the study is weighted towards data on the summative and formative uses of PBLA, she notes that the other issues also present themselves and are significant.
The recommendation, made by all four of the PBLA-related studies, for better ongoing support and training is something that needs to be addressed. From the personal and anecdotal experience driving this study comes the question of support, and the data collection will address how support is understood, and whether the support expected matches the support received. These questions are important because the studies conceive of support differently: Ripley states that support could come from a PBLA expert, while Radivojevic includes teaching partners and, in her specific study, other field-test teachers. Holmes (2015) concurs and suggests that this support take place in informal and formal staff professional development, and include colleagues, lead instructors, and unspecified others. The researcher is particularly attentive to the gap between the expected and actual role of support during implementation.
To address the issue of PBLA implementation generally, the scope of this paper is exploratory, and it aims to answer the following question:
- What is the experience of teachers in implementing PBLA in a community-based adult ESL classroom and program?
While Radivojevic’s (2014) study addresses a similar issue with regard to PBLA by looking at teacher experiences, her context is different, and so the question of contextualisation still needs to be answered.
Settings and Participants
This study is set in multiple locations of a community-based, school board operated, continuing education program. The program is part of the final implementation cohort, and the implementation began in 2015 with training for the lead instructors. The full implementation for this program is set for September 2017, with all active instructors having completed their training by April 2017. The participants for this study will consist of 2 instructors who have been implementing PBLA since their training in 2015 and 2 instructors who will have finished their PBLA training in March, and are in the interim period between training and implementation where they are encouraged to pilot elements of PBLA in their classes. This context and these participants were chosen to present a representative perspective of instructors across a more diverse community of practice, including classes at every level and across a variety of skill- and needs-specific courses. As such, participants in this study will include instructors who teach beginner students (CLB 0-4) and intermediate students (CLB 5-8), across skills (integrated four skills) or specific skills (conversation, or reading and writing), and across context (e.g. General English, Hospital English).
The participants will be recruited using a call for participants distributed throughout the program in question, with a call specifically for instructors who have either been using PBLA in their classes, or who have begun piloting elements of PBLA in their classes after having completed their training in March. With regards to the difference between these two groups, most of the instructors who have been using PBLA within the program were initially designated as lead instructors (instructors who are intended to be trainers for their colleagues in the use of PBLA), and it would be worth doing further research to compare the experiences if they turned out to be significantly different.
Data Collection and Analysis
Interview Design. Since this study is exploratory and seeks to answer the question of teacher experiences, it will adopt a narrative inquiry approach and collect data through interviews. A number of factors contribute to the questions and research instrument this study will use. The first significant factor is the researcher’s personal experience, which informs questions about instructors’ understanding of the principles and theory behind PBLA. The questions will ask about the opinions and experiences of other instructors, and will aim to understand how they consequently view PBLA. They will address instructors’ views on both theory and practice, and will consider the experiences of training for and using PBLA.
As part of this proposal, preliminary questions have been included in the appendix to illustrate the interview design. These questions will be piloted with colleagues who have been using PBLA and work outside of this program, and are identified as part of a separate implementation cohort under the PBLA.
The interview will open with demographic questions as a warm-up, and will focus on what the instructors understand about the holistic picture of PBLA and what they anticipate as the strengths and challenges of PBLA. Additionally, the interview questions will ask instructors to locate themselves within the implementation process and describe the experiences they have had thus far. The instructors’ understanding and experience of implementation support will also be explored within the constraints of the interview, as it is a question raised in Holmes’ (2015) and Ripley’s (2012) articles.
Data Analysis. After the interviews are conducted, the audio recordings will be transcribed and analysed using grounded theory, starting from the general area of interest: the experiences of the teachers. As Charmaz and Belgrave (2012) note, “grounded theorists cannot identify the most significant processes beforehand” (p. 348), so the key themes are expected to emerge from the transcribed interviews, understood comparatively and iteratively. The analysis will be thematic, focusing on significant themes that emerge from the interviews.
It is also important to acknowledge the fact that as an instructor in the PBLA implementation cohort identified for study, the researcher will have experience with PBLA as well as an understanding of the data presented in the interviews. This may affect how the data is evaluated, as the researcher’s opinion may be written into the analysis. To mitigate this impact, the points where the researcher’s opinion is included will be explicitly stated wherever possible.
The PBLA training sessions for instructors run from November 2016 through April 2017. The interviews will take place in May and part of June 2017; they will be transcribed as they are collected, and analysis will be ongoing. The scheduling of the training sessions is important only insofar as they precede the interviews; this ordering was largely incidental, but it situates the interviews at a point after the instructors have begun PBLA implementation.
Potential Implications for ESL Programs
This study is not meant to be an evaluation of the practice of PBLA, but rather an exploration of the strengths and challenges of using the tool in the described program, as perceived and experienced by instructors. Possibilities for future research on particular dimensions of PBLA may emerge. For instance, though not the intention of the study, comments made by instructors may raise questions about policy planning in future project implementations, and that may become its own topic of study.
On a more practical level, the findings may reveal a trend, positive or negative, in the experiences of instructors. This could reassure instructors that what they are experiencing is shared and that their concerns are valid, whether because the experiences are likely to improve or, equally possible, because there is a perceived problem with PBLA itself.
Potential Concerns and Ethical Issues
One major concern of this study relates to the dual role of teacher as researcher and vice versa. This dual role may be problematic because it could affect the analysis of the data and lead to a narrowed perspective in the discussion of major themes. It may also be beneficial: although the data will be analysed using grounded theory, the researcher’s personal experience provides a point of reference against which the data can be corroborated. The strength of this arrangement is that it enables the researcher to connect the literature and classroom experience to provide a more nuanced analysis of the data being collected, and it opens the possibility of more in-depth questioning during the interviews.
Another concern is participant anonymity. Procedurally, the participants will be recruited and given a letter of consent to accept or decline (see Appendix B), and the guiding principles for maintaining confidentiality and anonymity will be followed. The study is not framed as politically charged; it is primarily designed to explore issues surrounding PBLA implementation in the classroom rather than the politics of program design and policy planning. However, anecdotal evidence may prove to be problematic and revealing. Should the interviewer recognize this during an interview, data collection will pause until the interview can be redirected to the target questions. If identifying anecdotal evidence is not caught until transcription, the relevant segment will be omitted, provided it can be removed without affecting the answers to other questions. In the extreme case where an entire interview session is revealing in this way, the entire interview will be discarded.
An important issue related to both previous concerns is the role of the researcher-as-colleague and the participant–researcher relationship during the interviews. For the participants as colleagues, the temptation may be to treat the interview as a sounding board for complaints and concerns about PBLA; this would not serve the study. At the start of each interview, the researcher will state that the purpose of the interview is to learn about the participant’s experiences specifically, rather than to gather overall feelings about PBLA or tangential points.
- Ali, S. Y. (2005). An Introduction to Electronic Portfolios in the Language Classroom (TESL/TEFL). The Internet TESL Journal, 11(8). Retrieved from http://iteslj.org/Techniques/Ali-Portfolios.html
- Chang, C. (2001). A study on the evaluation and effectiveness analysis of web-based learning portfolio (WBLP). British Journal of Educational Technology, 32(4), 435–458.
- Charmaz, K., & Belgrave, L. L. (2012). Qualitative interviewing and grounded theory analysis. In J. F. Gubrium, J. A. Holstein, A. B. Marvasti, & K. D. McKinney (Eds.), The SAGE Handbook of Interview Research: The Complexity of the Craft (pp. 347–365). Thousand Oaks: SAGE Publications.
- Cotterall, S. (2000). Promoting learner autonomy through the curriculum: principles for designing language courses. ELT Journal, 54(2), 109–117. https://doi.org/10.1093/elt/54.2.109
- Flood, J., & Lapp, D. (1989). Reporting Reading Progress: A Comparison Portfolio for Parents. The Reading Teacher, 42(7), 508–514. Retrieved from http://www.jstor.org.ezproxy.library.yorku.ca/stable/20200199
- Fox, J. (2014). Portfolio based language assessment (PBLA) in Canadian immigrant language training: Have we got it wrong? Contact, 40(2), 68–83.
- Holmes, T. (2015). PBLA: Moving Toward Sustainability. TESL Canada Journal, 32(9), 113. https://doi.org/10.18806/tesl.v32i0.1220
- Kohonen, V. (2000). Student reflection in portfolio assessment: making language learning more visible. Visible and invisible outcomes in language learning. Babylonia, 1, 13–19.
- Little, D., Goullier, F., & Hughes, G. (2011). The European Language Portfolio: The story so far (1991–2011). Strasbourg. Retrieved from https://rm.coe.int/CoERMPublicCommonSearchServices/DisplayDCTMContent?documentId=09000016804595a7#search=little goullier hughes
- Miholic, V., & Moss, M. (2001). Rethinking Portfolio Applications and Assessment. Journal of College Reading and Learning, 32(1), 5–13.
- Moya, S. S., & O’Malley, J. M. (1994). A portfolio assessment model for ESL. The Journal of Educational Issues of Language Minority Students, 13(Spring), 1–16.
- Nation, I. S. P., & Macalister, J. (2010). Language Curriculum Design. New York: Routledge.
- Pettis, J. C. (2014). Portfolio-Based Language Assessment (PBLA): Guide for Teachers and Programs. Ottawa: Centre for Canadian Language Benchmarks. Retrieved from http://www.language.ca/documents
- Radivojevic, V. (2014). Using Cultural-Historical Activity Theory (CHAT) to Examine English as a Second Language (ESL) Teachers’ Experiences with Portfolio-Based Language Assessment (PBLA): A Case Study. Simon Fraser University.
- Ripley, D. (2012). Implementing Portfolio-Based Language Assessment in LINC Programs: Benefits and Challenges. TESL Canada Journal, 30(1), 69–86.
- Schärer, R. (2008). European Language Portfolio: Interim report 2007. Strasbourg. Retrieved from https://rm.coe.int/CoERMPublicCommonSearchServices/DisplayDCTMContent?documentId=09000016804595a6
Preliminary Interview Questions
Warm Up Questions
- What are the designated CLB levels for the classes you teach?
- Could you describe the classes you teach? (Context, skills)
- Before having to practice PBLA in your class, how would you describe your approach to teaching?
Questions about Instructor Understanding and Perception of PBLA
- What do you understand as your role within the program’s implementation of PBLA?
- Based on your understanding of PBLA, why was it implemented within the program?
- Based on your understanding of PBLA, what are the principles or theories that you feel support its use in the program?
- From the PBLA training sessions, what is something that you feel you have learned that you will be/have been able to take advantage of in your implementation of PBLA?
- Have you begun using/piloting PBLA in your classes?
- Since starting with PBLA implementation (whether training, piloting, or full use), what do you feel are some of its potential strengths? How have/will you take advantage of these strengths?
- Since starting with PBLA implementation (whether training, piloting, or full use), what do you feel are some of the challenges you may face or have faced in using it in your classes? How do you plan to respond to these in a way that makes sense for your classes?
For Instructors who have Begun Piloting/Implementing PBLA
- What elements have you started using?
- Did you feel that there was a process or natural progression for your implementation? If so, what was it?
- How has your experience with the PBLA implementation process changed your understanding and opinion of the PBLA practice (if at all)? Can you describe some of the defining experiences that speak to how you feel about the PBLA practice now?
For Instructors who have not Begun Piloting/Implementing PBLA
- What challenges or obstacles do you anticipate you, your coworkers, your managers, and your site will face when you implement PBLA in the classroom for the first time in September?
Questions about Support
- What is your understanding of the support system and support-feedback loop in PBLA?
- Do you feel you have been supported since you’ve started practicing PBLA? If so, by whom, and where?
- Do you feel you have been supported outside of the PBLA training sessions? If so, how?
- Did the amount of support you received in each phase meet your expectations or understanding of the role of support? Why or why not?