
Grammar, Writing, and Technology: A Sample Technology-supported Approach to Teaching Grammar and Improving Writing for ESL Learners


VOLKER HEGELHEIMER
DAVID FISHER
Iowa State University
Abstract:
English language learners are frequently unable to benefit from the prevailing process-writing approaches due to a lack of grammar and vocabulary knowledge relevant to academic writing. This paper describes how the need for explicit grammar instruction as part of preparing students to write can be addressed by using a collection of learner texts and transforming that collection into an online grammar resource for intermediate nonnative speakers (NNS) of English. Drawing on research in grammar and writing, the use of learner texts, and online interactivity, we outline the development and the prototype of the Internet Writing Resource for the Innovative Teaching of English (iWRITE). We discuss how the judicious use of advanced technology (e.g., XML) facilitated the implementation of iWRITE, an example of one possible approach to embodying aspects of second language acquisition (SLA) theory while taking advantage of the Web's potential for interactivity.
KEYWORDS
ESL Writing and Grammar, Learner Corpus, Web-based Resource Development, XML/XSL, Interactivity
INTRODUCTION
Despite participating in courses specifically aimed at improving the writing proficiency of English as a second language (ESL) learners, nonnative speakers (NNS) are frequently not prepared to produce acceptable academic writing (Hinkel, 2004). Hinkel (2002) points out that, among other problems, the relative absence of direct and focused grammar instruction, the lack of academic vocabulary development, and the exclusive use of a process-writing approach contribute to this problem. Even high intermediate and advanced NNS do not have the grammatical
and lexical wherewithal to benefit from process-writing approaches. Thus, researchers (Hinkel, 2002, and others) recommend specifically including grammar and vocabulary relevant to academic writing in the curriculum of writing classes for NNS. The availability of advanced technology, coupled with recent research dealing with learner texts, allows for the creation of systems specifically designed to address learner needs (Kuo, Wible, Chen, Sung, Tsao, & Chio, 2002; Wible, Kuo, Chien, Liu, & Tsao, 2001). An ideal platform for implementing these recommendations into functional systems is the World Wide Web (WWW).
In this paper, we draw on research in the area of grammar in writing approaches and suggest that technology can be instrumental in creating an innovative online grammar resource aimed at raising learner awareness of troublesome grammatical features. In particular, we show how, by harnessing the capabilities of technology and implementing the principles of computer-assisted language learning, learner texts can be transformed and integrated into an effective online resource. In doing so, we proceed as follows: First, we reiterate and highlight the need for including grammar instruction as part of ESL writing courses, review work that has been done to date using learner corpora to assist with such instruction, suggest features to be included in a Web-based resource based on information derived from an interactionist view of second language acquisition (SLA), and review existing writing systems. Second, we outline four stages used in the development of the Internet Writing Resource for the Innovative Teaching of English (iWRITE), describe the system's components, and give examples of its pedagogical uses. In the last part, we propose empirical research to evaluate the usefulness of this Web application.
WRITING AND GRAMMAR
Hinkel (2004) points out the mismatch between what is taught and what can be accomplished by intermediate- and advanced-level ESL writers. Often, she argues, “intensive, individualized help with sentence-level syntax […]” is needed despite the explicit grammar instruction learners have received. Since learners frequently do not have the competence they need, they are required to enroll in ESL writing courses. However, even these courses fail to adequately prepare NNS for the academic writing expected of them. One important concern is that, since the 1980s, writing classes have shifted away from a product approach to embrace a process approach to writing (Hairston, 1982). While important for the personal development of the learners, “the new instructional methodology centered squarely and almost exclusively on the writing process that fundamentally overlooked the fact that NNS writers may simply lack the necessary language skills (e.g., vocabulary and grammar) to take advantage of the benefits of writing process instruction” (Hinkel, 2004, p. 9). A related problem accompanying writing process instruction is the change of focus, whereby meaning and overall success in communication receive exclusive attention at the cost of accuracy (Williams, 1995, as cited in Granger & Tribble, 1998, p. 13; James, 1998). This lack of the required range of lexical and grammar skills for successful academic writing has been investigated by numerous researchers (e.g., Nation, 1990; Raimes, 1983; Read, 2000; Vann,
Meyer, & Lorenz, 1984; Vann, Lorenz, & Meyer, 1991). The findings reported in these investigations play an important role in the design and creation of the type of resource presented in this paper.
In addition to these concerns, it is the product, not the process, that is evaluated in academic testing situations in which students are asked to produce written texts, such as for assignments in most (if not all) higher-education classes—except writing classes. Strikingly, even in most placement test situations in English, only the product (i.e., the essay) is evaluated, while the teaching approach remains process oriented.
A distinct yet related aspect of process-writing approaches is that they integrate peer editing. Research (e.g., Hyland, 2002; Hinkel, 2004) supports the classroom experience that peer editing, while often perceived as helpful, may not lead students to improved error awareness and error recognition. Helping learners focus on errors typically committed by learners from a particular L1 can raise awareness of such problem areas and facilitate the detection (and prevention) of certain error types. In fact, learners often want to focus on form and wish for a pedagogical tool that serves as a reference and an easy-to-use resource. Nevertheless, the exclusive use of model texts that are not accessible to students is viewed skeptically by students and may lead to unrealistic expectations.
What is needed is direct instruction coupled with explicitly pointing out mistakes in essays written by language learners. Hinkel (2004) calls for innovative ways of teaching rather than more of the same. Recent developments in the area of corpus linguistics in general and in working with learner corpora in particular, as well as advances in technology, may be ideally suited to play a key role in reinventing (or at least supplementing) grammar teaching as part of a writing course. Each is discussed in turn below.
LEARNER CORPORA
Since being called a revolution in applied linguistics in the early 1990s (Granger, 1994), learner corpora have become a major source for learning about various errors, including L1 interference errors, particularly in ESL writing. One major project, the International Corpus of Learner English1 (ICLE), consisting of argumentative writings by ESL learners from different countries, provides learners with access not only to an error corpus but also to a comparison group corpus consisting of essays written by native speakers (NS) of English (Virtanen, 1996).
This type of research frequently informs pedagogy. For example, Granger and Tyson (1996) looked at the overuse of connectors, which they hypothesized stems from teaching learners lists of supposedly interchangeable connectors. Using a fairly large corpus of over 1,000 texts, Hinkel (2003) looked at the level of complexity exhibited by advanced NNS and compared it to texts written by NS. She found that significantly more markers of simplicity or “basicness” such as the be-copula or vague nouns were present in essays written by NNS. These learner corpora have been used to shed light on various aspects of learner language, including the use of connectors (Milton & Tsang, 1993), adjective intensification (Lorenz, 1998), adverbial connectors (Altenberg & Granger, 2002), overpassivization
errors (Cowan, Choi, & Kim, 2003), and syntactic and lexical constructions in academic writing (Hinkel, 2003). Other contributions highlight the importance of the corpus design (Granger, 1993; Meunier, 2002) and the possibilities for the creation of “corpus-informed learning materials” (Granger & Tribble, 1998).
In order to transform these learner corpora into useful learning and teaching tools, we must draw from the current research in CALL and online interactivity. The next section situates the interactionist theory of SLA within the more general discussions of online writing and pedagogical interactivity. In doing so, we provide a heuristic for the development and assessment of online tools.
CALL, WRITING SYSTEMS, AND WEB INTERACTIVITY
Phinney (1996) realized the importance of technology in writing and recognized the following paradigm shift: “As part of the changing culture of composition instruction, there is a new emphasis on de-centering authority, coupled with a recognition of the importance of collaborative learning, and a realization of the need for new models of writing and rhetoric” (p. 140). A gradual shift from word processing to collaborative writing in the late 1980s to mid-1990s necessitated the development of tools to accommodate this shift in pedagogy.
However, writing systems were often developed by writing teachers in response to a lack of appropriate writing tools (Phinney, 1996). This led to the creation of more collaboratively oriented writing environments such as the Daedalus Integrated Writing System and Prep Editor. The focus of these tools was in line with the predominant process approach to writing and, therefore, teachers or peers used these tools mostly to make organizational and rhetorical comments.
Milton (1998) outlined an electronic resource aimed at creating “electronic language learning” experiences. He described how a comparison of a nonnative learner corpus, called an interlanguage corpus, with an NS corpus could inform the creation of “electronic exercises, tutorials, and tools” (p. 186). Cowan et al. (2003) discussed one example of a comprehensive electronic tool. Their extensive CALL program, ESL Tutor, is aimed at “investigating whether persistent errors can be eradicated” (p. 457).
Since the widespread availability of the Web and numerous Web- and computer-based writing systems, Wible et al. (2001) noted that “content providers often end up accommodating their content to existing systems rather than imagining first how the technology should be designed to accommodate the needs of the content and the learners” (p. 298). Maddux (2002), noting the exponential growth in the number of Web-based educational systems, attributed part of the failure of Web-based instruction to a lack of effective interactivity, which he called “the most promising, yet scarce characteristic that can be built into Web pages” (p. 10). Maddux distinguished between two types of uses of technology. Type I uses “make it quicker, easier, or more convenient to teach in traditional ways while Type II uses make it possible to teach in new and better ways that are not otherwise available” (p. 10). Similarly, Wible et al. argued that Web-based writing environments should be developed “expressly to meet the unique needs of particular learning domains in ways that traditional classrooms can not” (p. 298). Kuo
et al. (2002) described the Intelligent Web-based Interactive Language Learning (IWiLL) system they developed to address these needs.
The significant features these more recent resources have in common are that they are built on or around learner texts (a learner corpus), that they are searchable, and that they are Web-based. Also, the tools in these resources put more emphasis on grammatical and lexical errors rather than on organizational and rhetorical problems. Finally, the systems attempt to simultaneously address learner needs (e.g., appropriate level of difficulty, clear feedback, and accessible metalanguage), teacher needs (e.g., elimination of repetitive tasks, increased learner independence, and identification of error patterns), and researcher needs (e.g., tracking student use of the system).
One theoretical framework that can serve as a basis for the development and assessment of an online resource that integrates grammar, writing, and the use of learner corpora is the interactionist theory of SLA. Because it focuses mainly on the roles that input and interaction play in instructed (or classroom-based) settings (Pica, 1994; Long, 1996; Gass, 1997), the interactionist theory offers hypotheses that are pertinent to the design of CALL activities and resources. Acquisition occurs only when linguistic input becomes intake, that is, is comprehended syntactically and semantically by the learner. Noticing linguistic input is viewed as a prerequisite for acquisition (Schmidt, 1990), and noticing is more likely to occur during interaction. Hence, software features that enhance noticing in general and that help the learner to focus on form (FoF) (Long, 1991) are viewed as beneficial. Chapelle (1998) proposed seven criteria for the development of multimedia CALL based on hypotheses that derive from interactionist-based research:
1. make linguistic characteristics salient,
2. help learners comprehend semantic and syntactic aspects of input,
3. provide learners with opportunities to produce output,
4. provide learners with opportunities to notice errors in their own output,
5. support learners in correcting their linguistic output,
6. allow target language interactions to be modified for negotiation of meaning, and
7. engage learners in L2 tasks designed to maximize opportunities for good interaction.
Chou (2003) sought to assist those developing what Maddux called Type II uses of technology—or what we can conceive of as interactionist learning systems—by providing a list of interactivity dimensions culled from the past 15 years of research on instructional design. These dimensions help us envision how Chapelle's interactionist criteria can be concretely embodied in a Web-based system while also providing a rubric of sorts for assessing such a system's level of interactivity (see Table 1). Guided by these considerations, we describe in the next part the development, implementation, and anticipated use of iWRITE.
Table 1
Interactivity dimensions (adapted from Chou, 2003)
Interactivity dimensions | Brief description
Choice | Ability to access information of varying types (i.e., multimedia)
Nonsequential access of choice | Ability to choose route through information
Responsiveness to learner | System responds to users' requests quickly
Monitoring information use | System collects data about users and their use patterns; users can access data about their use
Personal-choice helper | Information helps learner make better choice of content
Adaptability | System adapts learning experience to individual users
Playfulness | Information arouses curiosity and encourages learners to play and explore
Facilitation of interpersonal communication | Users (instructors and students) can communicate with each other online
Ease of adding information | Users (instructors and students) can add information to the system
RESOURCE DEVELOPMENT
Taking into consideration the opportunity presented by the collection of genuine learner data in the form of placement essays, the advantages of learner corpora, and principles derived from SLA theory, the development of an appropriate Web-based resource also needs to address issues specific to the Web environment in order to arrive at an application that truly transforms a learner corpus.
Project Development
Figure 1 provides an overview of the iWRITE system, which includes the learner corpus and the documents and activities that support student and instructor interaction with it. For clarity, we have divided the process into four stages, which correspond to the type of work undertaken on (or the instructional value we are adding to) the corpus. In each stage, the corpus remains at the center of the process, and the materials and activities that surround it serve to make the corpus useful to students and instructors by enabling the interactivity that characterizes the iWRITE interface.
Stage 1: Corpus and Database Design and Assembly
All essays selected for inclusion in the corpus were handwritten as part of an English placement test at Iowa State University on one of four different topics requiring
expository writing. The essays were rated by two independent readers who both agreed on the specific placement of students.2 Perfect interrater reliability was the primary criterion for selection. Once typed, the learner texts amounted to a total collection of 45 essays, or 12,839 words. In total, 1,268 errors were identified and marked. The following information was also captured and/or prepared for entry into the relational database:
1. nationality, TWE score, and TOEFL scores of the writers of the essays;
2. essay topic;
3. contexts, solutions, and corrected contexts (all described below) for marked errors; and
4. pointers to Flash movies, Word documents (marked during “filming” of Flash movies), and reference (“Additional Information”) files.
Figure 1
Overview of the Creation of iWRITE
Stage 2: Learner Text Mark-up and Solution Production
At first, five essays were analyzed in detail, and the initial error categories were modified according to the actual errors found in the essays. Subsequently, the remaining 40 essays were marked using the coding scheme outlined in the Appendix, resulting in marked-up essays like the one illustrated in Figure 2. The error codes were derived from error codes currently in use at the university and modified to fit the errors exhibited by the learners in this subsample. In addition to grammatical errors, lexical errors, which Santos (1988) found to be considered the most serious errors by professors who evaluated nonnative writers, were also
included. The importance of focusing on both grammatical and lexical errors is also supported by findings reflected in other studies (Vann et al., 1984; Vann et al., 1991), in which lexical and semantic errors were found to be most problematic, particularly when committed by NNS. In subsequent versions of iWRITE, a display of errors based on error gravity will be considered, but the current incarnation does not assign weights to errors.
Figure 2
Example of a Marked-up Essay
Database Build and Load
In the next step, each error was entered into a spreadsheet along with identifying information and one possible solution (see Table 2). However, sentences often contained multiple errors; therefore, an error-free version of the entire sentence (or context) was also entered into the spreadsheet. The marking and entering were done by two different members of the research team in order to minimize errors and to double-check the error marking. After the marking was complete, the spreadsheet was loaded into a table in the relational database.
XML Mark-up: Creating Smart Documents
After the errors were uploaded into the database, the essays were marked up with tags developed using XML. A set of tags (technically known as elements within a document type definition) that represented each of the error categories (paragraph, sentence, word, determiner, and miscellaneous) was created. By identifying each error uniquely within the error-category tags, and therefore within the text of the corpus itself (i.e., by establishing the linkage between the corpus and the database), we were able to design iWRITE to
1. draw on the relational database table that contains one possible solution for the identified error as well as a corrected context, in which all of the errors in the text surrounding the marked error are corrected (these had been entered into spreadsheets and uploaded into the database as described above), and thus enable students to get solution information by clicking on a link in the essay; and
2. make available the “additional help” reference pages for each type of error from a variety of contexts.
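The fragment below is a minimal, hypothetical sketch of what such error-category mark-up could look like for the word-level error listed in Table 2. The element and attribute names (essay, sentence, word, type, id) and the identifier scheme are assumptions made for illustration only; the actual elements defined in iWRITE's document type definition are those shown in Figure 3.

    <!-- Hypothetical mark-up of one sentence from the learner corpus.      -->
    <!-- The assumed id value combines the essay identifier, the numeric    -->
    <!-- error code, and the instance number, mirroring Table 2 below.      -->
    <essay id="Spr0244">
      <sentence>
        The most <word type="SPELL" id="Spr0244-0301-1">recently</word>
        problem I met was just few days ago [...]
      </sentence>
    </essay>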
Table 2
Contents of the Excel Error Spreadsheet
Column name | Brief explanation | Example
EssayID | Essay identifier | Spr0244 (i.e., Spring 2002, #04)
MainID | Main error category | Word-level error
SubID | Error description | misspelling
MainSubNum | Instance identifier, the Nth occurrence of the same error | 1
Item | The erroneous item as it appears in the essay | recently
ItemCorrect | The corrected form of the item (needed for identification purposes) | recent
Context | The sentence in which the error occurred | The most recently problem I met was just few days ago […]
Solution 1 | The context with the marked error corrected | The most recent problem I met was just few days ago […]
ContextCorrect | The corrected version of the entire sentence | The most recent problem I had just a few days ago […]
Figure 3 shows how these error tags look and how they correspond with the entries in the relational database. This activity allowed yet another examination of the texts to ensure the accuracy of the error marking. The significance of this mark-up system is described in “Stage 3: Corpus Transformation” below.
Figure 3
XML Mark-up Illustration
Video Recording
The research team also annotated Word versions of placement essays using the “Track Changes” feature. This activity, along with oral comments made by an annotator, was recorded using Camtasia, a program that allows users to capture and replay motion that takes place on a computer monitor. These audio/video files were then transformed into Flash movies to permit speedier delivery over the Web. The annotator did not have access to the marked-up version of the text. Rather, 5 minutes were allotted to allow the annotator to glance at the essay before making suggestions and corrections, which were often more qualitatively oriented and included praise and constructive suggestions rather than only syntactic and lexical corrections, mimicking an interaction between a student and an instructor while reviewing an essay.
Reference Page Creation
After the major error types were identified, the team created a number of reference, or “Additional Information” pages. These pages contain detailed explanations of the error, examples of how to fix the error, and links to websites where students could go for more information.
Stage 3: Corpus Transformation
An important part of creating layered interactivity lies in providing students with the ability to query the essays in various ways. In essence, the XML tags encode some of the expertise that has traditionally resided in instructors and make it accessible to students.
XSL: Displaying Documents Smartly
Like all tags developed using XML, iWRITE's error-category tags contain semantic information only, not layout or other appearance information (as HTML tags do). To display the marked-up essays in a meaningful (and pedagogically effective) way, iWRITE employs a number of transformations to output essays in HTML so that students can view and interact with them. This output provides students with a means of using the marks provided by the essay evaluators without displaying an overwhelming number of marks simultaneously. To provide this interactivity, iWRITE uses XSLT (Extensible Stylesheet Language Transformations) to highlight errors of a particular category within an essay while providing links to solutions for the errors.
XSL (Extensible Stylesheet Language) transformations involve a marked-up document (like the learner corpus), a transformation stylesheet, and software that creates a new document out of the two. The stylesheets in iWRITE contain a set of instructions about how to display each element (i.e., error type) for which a tag has been defined. The transformation software creates a new document that renders the data associated with each tag in the way that the stylesheet instructs. In other words, the transformations that occur in iWRITE produce HTML documents that appear in the students' browsers with certain error types highlighted and linked to solutions (see Figure 4).
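As an illustration of how such a transformation could be written, the fragment below is a minimal sketch of an XSLT stylesheet that copies an essay through unchanged while rendering every word-level error element (as in the hypothetical mark-up sketched earlier) as a highlighted link to its solution. The element name, CSS class, and solution URL pattern are assumptions for this sketch, not the stylesheets actually used in iWRITE.

    <?xml version="1.0"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="html"/>

      <!-- Identity rule: copy all other elements, attributes, and text unchanged. -->
      <xsl:template match="@*|node()">
        <xsl:copy>
          <xsl:apply-templates select="@*|node()"/>
        </xsl:copy>
      </xsl:template>

      <!-- Render each word-level error as a highlighted hyperlink; the target
           is assumed to be a solution page keyed to the error's database id. -->
      <xsl:template match="word">
        <a class="word-error" href="solution.html?id={@id}">
          <xsl:apply-templates/>
        </a>
      </xsl:template>
    </xsl:stylesheet>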
Figure 4
Transformations on an Essay from the Learner Corpus
The XSL stylesheet (on the left) is combined with an essay from the learner corpus (on the right). The iWRITE software uses the XSL stylesheet to create an HTML page in which errors of particular types (e.g., paragraph, sentence, and word errors) are hyperlinked to solutions for those errors.
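Under the assumptions of the two sketches above, applying such a stylesheet to the marked-up sentence would produce HTML along the following lines, with the erroneous word left in place but wrapped in a link that opens the corresponding solution record:

    <sentence>
      The most <a class="word-error" href="solution.html?id=Spr0244-0301-1">recently</a>
      problem I met was just few days ago [...]
    </sentence>

Only the presentation changes; the corpus and its error annotations remain untouched, which is what allows the same source documents to drive the Solutions, Essays, and Practice views described in the next stage.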
Stage 4: Corpus Presentation: iWRITE, a Smart Corpus-based Prototype
The homepage of the iWRITE application gives learners access to five main components (Solutions, Essays, Practice, Marking, and Corpus) and a logout option (see Figure 5).
Figure 5
iWRITE Homepage
The Solutions section provides learners with access to all marked-up errors contained in the learner corpus. Learners can select a specific error and look at all the instances in which that error occurred (see Figure 6).
Figure 6
Solutions Section
In addition to viewing the error, the context in which it occurred, and its solution, learners have the option of viewing the error in the context of the essay by clicking on the image in the left-hand column (see Figure 7).
Figure 7
Specific Errors and Solutions
When the learner clicks on an error in the context of the entire essay, the program provides an error description, a corrected context, and a link to additional information (see Figure 8). Additionally, for all word-level errors, the program includes a link to an online corpus.
Figure 8
Highlighted Error in the Essay
The Essays section provides learners with the opportunity for in-depth work with essays based on the writer's native country, essay topic, and TOEFL score. Essays are initially displayed in unmarked form so that learners can choose an error category (word or sentence level) and see the errors highlighted, with the explanations of the errors appearing on demand in the right frame (see Figure 9). Here, both the solution for the specific error and the corrected context are presented. As in the Solutions section, a link to additional information is provided at the bottom of the page.
Figure 9
Essay Viewer
The Practice section permits learners to generate worksheets in which the errors in one error category are highlighted (see Figure 10). While it is possible simply to complete the textboxes next to the errors and print them out, the recommended procedure is to create and download worksheets in Word format, whereby the errors remain salient through the use of font colors. Learners can then focus on specific error categories and attempt to correct individual errors. They can then save the worksheets for later use.
Figure 10
Practice Section
The Marking section allows learners to select essays and to watch and listen as an instructor annotates them verbally and electronically (using the Track Changes feature in Word). A link to the marked-up version of the essay lets learners download the file for reference or discussion (see Figure 11).
Classroom Application
iWRITE has immediate pedagogical applications in that it can be used to raise learners' grammatical awareness, encourage learner autonomy, and help learners prepare for editing or peer editing. In this section, sample classroom applications of each of the four major sections of iWRITE are outlined.
First, iWRITE's Solutions section can be used to help learners understand the terminology (or metalanguage) necessary to begin to ask specific questions about grammar, which is one important aspect of becoming an autonomous learner. The Solutions section presents the error terms and examples using appropriate grammatical terminology. The Essays section allows learners to dissect essays in layers
since they can look at different categories of errors at the word, sentence, or paragraph level. This section is ideally suited to classroom settings because it does not confront learners with an overwhelming number of errors at the same time. In addition, the essays are accessible by the writer's country of origin. Therefore, this section can be used to prepare for upcoming peer-editing sessions in that readers can review essays written by a writer from the same country as the one they will read during the peer-editing session. The Practice section can be used to generate worksheets as Word documents, which can be used in a small-group activity in which each group member is responsible for finding (and correcting) specific mistakes at the word, sentence, or paragraph level. Upon completion, the individual members can collectively correct the essay and compare the errors they detected with the ones accessible through iWRITE. The last major section, Marking, is aimed at encouraging learners to interact cognitively with the audio/video annotations of an essay. It can be used for peer-editing or error-detection exercises in which unmarked essays can be downloaded, marked up, and corrected by learners, who can then verify their choices using iWRITE.
Figure 11
Marking: Listen to and watch annotating in progress
Applications like iWRITE can also be utilized during teacher training. In particular, the Marking section holds promise especially for nonnative teachers since it is possible to observe model behavior of a writing instructor who is marking up an essay. Similarly, the other sections could be used in teacher-training classes in which the trainees would act as students while going through various essays trying to identify problems. This might be especially fruitful for future teachers who share the same L1 with their students and may be less likely to identify errors that their students could commit.
These are just a few potential uses of applications like iWRITE. Future development of this application will need to include more learner texts so that multiple essays from learners of specific L1s can be made available.
CONCLUSIONS
Building collections of online resources that focus on the needs of users is not a simple process (Calverley & Shephard, 2003). We envision our effort, then, as an attempt to create a prototype of what Maddux (2002) called a Type II system in which pedagogical value is added to a learner corpus by providing a number of different kinds of interactivity. As we took up the challenge of creating a Type II system, we decided to use a browser interface and Web pages, rather than a more proprietary model that might have been housed on a few computers in our language-learning lab. We made this choice for two main reasons. First, Hillman, Willis, and Gunawardena (1994) noted that the “extent to which a learner is proficient with a specific medium correlates positively with the success the learner has in extracting the desired information” (p. 32). Many of the students who will be using iWRITE have a good deal of experience searching the Web and working with browsers and thus should be comfortable working with a system that uses familiar Web conventions (e.g., links and back buttons). Second, we hope eventually to make this resource available to a number of teachers/learners around the world at no or minimal cost, so the Web seemed the ideal medium. If readers are interested in using the system, they should contact Volker Hegelheimer at volkerh@iastate.edu.
Next, we worked to decide which kinds of interactivity would be most helpful in (a) enabling our students to achieve, by means identified in current SLA theory, the learning goals set forth in the ESL class in which they would be using the system and (b) enabling us as researchers to determine how (or whether) the system was effective in helping students with their language-learning efforts. Table 3, an expanded version of Table 1 above, relates Chou's (2003) interactivity dimensions to student needs and instructor goals and outlines how this is accomplished in iWRITE.
We view iWRITE as a prototype of smart, dynamic, and learner-corpus-based applications that will enhance language learning in the near future. In this paper, we illustrated one approach to transforming a learner corpus into a sound online resource using theory-supported design features and an iterative, dynamic approach. This incarnation of iWRITE deals with predefined syntactic problems. However, the underlying architecture of this program can be used to address other problems as well, be they more rhetorical aspects of writing or writings composed by NS on a variety of topics.
While preliminary feedback from learners and teachers suggests that iWRITE is viewed as a potential asset for language learning, what needs to be examined in greater detail next is how language learners and language teachers perceive iWRITE in terms of its potential to transform learners' awareness of grammatical errors and their writing. Among the various notions driving this line of research, one ideal outcome would be to generate an automatic profile of a learner (e.g., Granger & Rayson, 1998). Since the creation of the first version of iWRITE in June 2003, the resource has been used by approximately 200 learners in intermediate and high-intermediate academic-writing classes at Iowa State University.
Table 3
Interactivity dimensions and ESL considerations
Interactivity dimensions | Needs of ESL students/Goals of instructors | System function (interaction)
Choice | NNS may learn best through multimodal presentation of material (i.e., aural, visual, reading) | Audio/video movies of assessment; layered essay presentation; corpus look-up; reference sources; worksheets
Nonsequential access of choice | Students with varying L1s and L1-specific problems; students with varying levels of L2 competence | Homepage with five choices for initial access; access to layered essays and solutions from multiple points within the system
Responsiveness to learners | Immediate, performance-based feedback encourages learning | Not an intelligent system in its current iteration; upgrades of hardware and software will become necessary at certain intervals
Monitoring information use | Need to correlate student activity on the system with writing/classroom performance | Elaborate tracking feature records learner access, which can be accessed and viewed directly or through report-generating queries*
Personal-choice helper | Need to help students find the content that would prove most helpful to them | Advice/instructions provided on each webpage
Adaptability | Activities at hugely different proficiency levels are ineffective | Not implemented as of yet; adaptability based on the learner's interaction (e.g., searches) is envisioned
Playfulness | Need for students to examine a number of works/examples | Many essays; ability to explore various error types; dynamic, layered presentation
Facilitation of interpersonal communication | Need for students to work together in various interactions with the tool | Handled in classroom through carefully assigned tasks and groups
Ease of adding information | Need to add each year's placement essays to the corpus | Information can currently be added only by the savvy instructor; future iterations need to allow students to become active contributors
*Additionally, postuse feedback sheets combined with focused interviews complete the data-gathering phase of the program.
As is the case with other additions to the curriculum, the instructors are experimenting with various ways to integrate iWRITE into their curriculum and their classrooms. It is currently used to raise learners' grammatical awareness, to introduce metalanguage related to grammar, and to prepare for peer-editing sessions.
Indicative of how students perceive the resource is the following quote from one intermediate-level student: “When I revised my partner's essay I used iWRITE to help. We did it in class but I also did it outside of class. I think it helped, but I still think it's really hard to detect errors on my own.” The use of this resource also promises increased motivational appeal. During a semistructured interview, one student expressed his enthusiasm about the program by saying “I particularly like the marking component of the program. I love it! It feels like my tutor is sitting beside me.” Another student's remark (“When I peer-edit I look at paragraph level, sentence level, [and] word level now.”) hints at a positive analytical development in that the notion of a layered approach towards peer editing seems to be growing. However, while these reactions are promising, more research (e.g., Hegelheimer, in press) is needed before conclusions can be drawn.
We end by reminding readers that Chapelle (2001) proposed a three-tiered approach to CALL evaluation consisting of a judgmental (or logical) analysis of CALL systems and of tasks completed by learners engaged in such systems followed by an empirical analysis. In this paper, we focused on the judgmental analyses. Now empirical studies need to follow to evaluate CALL systems like iWRITE and the effectiveness of tasks students can and should engage in. We would like to invite researchers to make use of our system, to collaborate, and to conduct empirical investigations.
NOTES
1 The ICLE is being compiled at the University of Louvain in Belgium. A detailed description of this effort is presented in Granger (1993).
2 The raters had three choices: place learners in the first level of ESL writing instruction, place learners in the second level of ESL writing instruction, or exempt learners from taking ESL writing courses and recommend their immediate placement into regular composition classes.
REFERENCES
Altenberg, B., & Granger, S. (2002). Lexis in contrast: Corpus-based approaches. Amsterdam; Philadelphia: J. Benjamins.
Calverley, G., & Shephard, K. (2003). Assisting the uptake of on-line resources: Why good learning resources are not enough. Computers & Education, 41 (3), 205-224.
Chapelle, C. A. (1998). Multimedia CALL: Lessons to be learned from research on instructed SLA. Language Learning & Technology, 2 (1), 22-34. Retrieved September 22, 2005, from http://llt.msu.edu/vol2num1/article1
Chapelle, C. A. (2001). Computer applications in second language acquisition. New York: Cambridge University Press.
Chou, C. (2003). Interactivity and interactive functions in web-based learning systems: A technical framework for designers. British Journal of Educational Technology, 34 (3), 265-279.
Cowan, R., Choi, H. E., & Kim, D. H. (2003). Four questions for error diagnosis and correction in CALL. CALICO Journal, 20 (3), 451-463.
Gass, S. M. (1997). Input, interaction, and the second language learner. Mahwah, NJ: Lawrence Erlbaum Associates.
Granger, S. (1993). International corpus of learner English. In J. M. G. Aarts, P. D. Haan, & N. Oostdijk (Eds.), English language corpora: Design, analysis and exploitation: Papers from the thirteenth International Conference on English Language Research on Computerized Corpora, Nijmegen 1992 (pp. 57-71). Amsterdam; Atlanta, GA: Rodopi.
Granger, S. (1994). Learner corpus: A revolution in applied linguistics. English Today, 10 (3), 25-29.
Granger, S., & Rayson, P. (1998). Automatic profiling of learner texts. In S. Granger (Ed.), Learner English on computer (pp. 119-131). London: Addison Wesley Longman.
Granger, S., & Tribble, C. (1998). Learner corpus data in the foreign language classroom: Form-focused instruction and data-driven learning. In S. Granger (Ed.), Learner English on computer (pp. 199-211). London: Addison Wesley Longman.
Granger, S., & Tyson, S. (1996). Connector usage in the English essay writing of native and non-native EFL speakers of English. World Englishes, 15 (1), 17-27.
Hairston, M. (1982). The winds of change: Thomas Kuhn and the revolution in the teaching of writing. College Composition and Communication, 33 (1), 76-88.
Hegelheimer, V. (2003). iWRITE [Web application]. Available at http://iwrite.engl.iastate.edu/placement/. Ames, IA: Author.
Hegelheimer, V. (in press). Helping ESL writers through a multimodal, corpus-based, online grammar resource. CALICO Journal.
Hillman, D. C. A., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. The American Journal of Distance Education, 8 (2), 30-42.
Hinkel, E. (2002). Teaching grammar in writing classes: Tenses and cohesion. In E. Hinkel & S. Fotos (Eds.), New perspectives on grammar teaching in second language classrooms (pp. 181-198). Mahwah, NJ: Lawrence Erlbaum Associates.
Hinkel, E. (2003). Simplicity without elegance: Features of sentences in L1 and L2 academic texts. TESOL Quarterly, 37 (2), 275-301.
Hinkel, E. (2004). Teaching academic ESL writing: Practical techniques in vocabulary and grammar. Mahwah, NJ: Lawrence Erlbaum Associates.
Hyland, K. (2002). Teaching and researching writing. Harlow, Essex: Longman.
James, C. (1998). Errors in language learning and use. London: Longman.
Kuo, C.-H., Wible, D., Chen, M.-C., Sung, L.-C., Tsao, N.-L., & Chio, C.-L. (2002). The design of an intelligent web-based interactive language learning system. Journal of Educational Computing Research, 27 (3), 229-248.
Long, M. H. (1991). Focus on form: A design feature in language teaching methodology. In K. de Bot, R. Ginsberg, & C. Kramsch (Eds.), Foreign language research in cross-cultural perspective (pp. 39-52). Amsterdam: John Benjamins.
Long, M. H. (1996). The role of linguistic environment in second language acquisition. In W. C. Ritchie & T. K. Bhatia, (Eds.), Handbook of second language acquisition (pp. 413-468). San Diego, CA: Academic Press.
Lorenz, G. (1998). Overstatement in advanced learners' writing: Stylistic aspects of adjective intensification. In S. Granger (Ed.), Learner English on computer (pp. 53-66). London: Addison Wesley Longman.
Maddux, C. D. (2002). The web in education: A case of unrealized potential. Computers in the Schools, 19 (1/2), 7-17.
Meunier, F. (2002). The pedagogical value of native and learner corpora in EFL grammar teaching. In S. Granger, J. Hung, & S. Petch-Tyson (Eds.), Computer learner corpora, second language acquisition and foreign language teaching (pp. 119-142). Amsterdam: John Benjamins.
Milton, J. (1998). Exploiting L1 and interlanguage corpora in the design of an electronic language learning and production environment. In S. Granger (Ed.), Learner English on computer (pp. 186-198). London: Addison Wesley Longman.
Milton, J., & Tsang, E. (1993). A corpus-based study of logical connectors in EFL students' writing. In R. Pemberton & E. Tsang (Eds.), Studies in lexis (pp. 215-246). Hong Kong: HKUST.
Nation, I. S. P. (1990). Teaching and learning vocabulary. New York: Newbury House.
Phinney, M. (1996). Exploring the virtual world: Computers in the second language writing classroom. In M. Pennington (Ed.), The power of CALL (pp. 137-152). Houston, TX: Athelstan.
Pica, T. (1994, September). The language learner's environment as a resource for linguistic input? A review of theory and research. ITL, Review of Applied Linguistics, 105-106, 69-116.
Raimes, A. (1983). Techniques in teaching writing. Oxford: Oxford University Press.
Read, J. (2000). Assessing vocabulary. Cambridge: Cambridge University Press.
Santos, T. (1988). Professors' reactions to the academic writing of non-native speaking students. TESOL Quarterly, 22 (1), 69-90.
Schmidt, R. (1990). The role of consciousness in second language learning. Applied Linguistics, 11, 129-158.
Vann, R., Lorenz, F., & Meyer, D. E. (1991). Error gravity: Response to errors in the written discourse of nonnative speakers of English. In L. Hamp-Lyons (Ed.), Assessing second language writing (pp. 181-196). Norwood, NJ: Ablex.
Vann, R. J., Meyer, D. E., & Lorenz, G. (1984). Error gravity: A study of faculty opinion of ESL errors. TESOL Quarterly, 18 (3), 427-440.
Virtanen, T. (1996). Exploiting the international corpus of learner English (ICLE). AFinLAn vuosikirja, 54, 157-166.
Wible, D., Kuo, C.-H., Chien, F.-Y., Liu, A., & Tsao, N.-L. (2001). A web-based EFL writing environment: Integrating information for learners, teachers, and researchers. Computers & Education, 37 (3-4), 297-315.
Appendix
Error Codes and Examples used in iWRITE
Code | Numeric Code | Brief description | Example

Paragraph
REP | 0204 | repetition of words, phrases, or ideas | I'm now experiencing this challenge at this moment.
PRREF | 0203 | incorrect/unclear pronoun reference | The teacher just sat there doing their own stuffs.
TRANS | 0202 | transitions and connectors | By the time passing on, he tried to talk to me frequently and eventually we had become friends. During that moment, he was the only friend that I had.
TC | 0201 | tense consistency | Finally, I join them and we used to smoke in the toilet.

Sentence
WO | 0108 | word order | No matter how tough is my future, I won't be afraid because I am his daughter.
CS | 0101 | comma splice | When I was a young girl, my parents told me that I'm not a lonely man, I lived in society.
MW | 0109 | missing words | But all in all it [?] a good rule.
MDO | 0107 | incorrect or missing direct object | I tried to persuade [?] not to smoke in school but they just ignored me.
MRP | 0106 | incorrect or missing relative pronoun | I walk through the campus and get into the building seeking someone [?] could help me.
SV | 0105 | S-V agreement | My parents wants the best out of me.
PS | 0104 | parallel structure | Therefore, he had tried to influence me and modified the concept of my life.
FRAG | 0103 | fragment | From that moment.
RUNON | 0102 | run-on | I like her advice and use her advice so I'm very healthy and I have a very good life now.
SENT | 0110 | embedded sentence problem | When I was a child, my parents always told me that not to play basketball.

Word
PLURAL | 0306 | plural/singular confusion | So with my eye wet, I went to sleep.
POS | 0305 | part of speech error | Anyway, my mother always advice me not to waste food.
VBUSE | 0307 | verb usage | So, we all allow to play a game.
VBFORM | 0308 | verb form | Have you ever think of being a parent?
CHOICE | 0309 | word choice | I know the truth and I may throw their advice.
COUNT | 0311 | countable/uncountable noun confusion | When I was still a child, my parents used to give me a lot of advices.
SPELL | 0301 | misspelling | I realy appriciate my parents' advice.

Determiner
DET | 0403 | wrong article | He is a optimistic person.
DET | 0404 | unnecessary article | He brought a gambling cards.
DET | 0402 | missing indefinite article | For example, I had [?] experience before.
DET | 0401 | missing definite article | Now, he is running a very good restaurant in [?] local community.

Misc
PREP | 0503 | preposition selection | She saw us lining up at the corridor to receive our punishment.
EXP | 0504 | idiomatic expression | People who study smart in the exam will get flying color result.
UNCLEAR | 0505 | unclear meaning, ambiguous | I think it's a very good method in one's growed way.
PREP | 0506 | unnecessary preposition | I have listen to this sentence for hundreds of times since I was a child.
PREP | 0507 | missing preposition | She always works from early morning until late [?] night.
PHVRB | 0502 | phrasal verb | Finally they were caught by the on-duty staff and kick off from school.
ACKNOWLEDGMENTS
We would like to thank Carol Chapelle for her insightful comments and suggestions on earlier versions of this manuscript and the anonymous reviewers for CALICO Journal for their concrete recommendations. The Corpus section is provided as a resource for learners that allows them to search for occurrences of words as used by NS. The search queries the Brown corpus using the application program interface (API) provided for interfacing with a concordance application written and provided by Chris Greaves. Parts of iWRITE were developed as part of a research project funded by a College of Liberal Arts and Sciences Faculty Development Grant at Iowa State University.
AUTHORS' BIODATA
Volker Hegelheimer is currently Assistant Professor in the Department of English and the M.A. Program in Teaching English as a Second Language/Applied Linguistics at Iowa State University. He teaches graduate courses on technology in language teaching and research and undergraduate and graduate courses in English as a Second Language. His research interests include applications of the WWW and emerging technologies in language learning and language testing. His publications have appeared in journals such as Language Testing, System, ReCALL, and Language Learning & Technology. He is the author of iWRITE.
David Fisher is a Ph.D. student in Rhetoric and Professional Communication at Iowa State University. He has worked for several years in the software-development industry as a designer, writer, trainer, tester, analyst, and project manager. His research interests include situated learning, school-workplace transitions, and instructional design. He is the chief programmer and designer of iWRITE.
AUTHORS' ADDRESSES
Volker Hegelheimer
Iowa State University
Department of English
341 Ross Hall
Ames, IA 50011
Phone: 515/294-2282
Email: volkerh@iastate.edu
David Fisher
Iowa State University
Department of English
451 Ross Hall
Ames, IA 50011
Phone: 515/294-2180
Email: ddfishe@iastate.edu

