EdTechforDHH

[1] Gwen C. Nugent. Deaf Students' Learning From Captioned Instruction: The Relationship Between the Visual and Caption Display. The Journal of Special Education, 17(2): 227-234, 1983.

Comments: Visuals + captions are better for everyone (both deaf and hearing students) than visuals alone or captions alone.

[2] David Hayden, Dirk Colbry, John A. Black, Jr., Sethuraman Panchanathan. Note-taker: enabling students who are legally blind to take notes in class. ASSETS ’08: Proceedings of the ACM SIGACCESS conference on Computers and accessibility, pages 81–88, New York, NY, USA, 2008. ACM Press.

Comments: Designed for low vision students, but similar in spirit to ClassInFocus. In fact, it wouldn't be hard to include low vision students in the target population for ClassInFocus.

[3] Lisa Elliot (also with Pam Francis). Adapting Tablet PC Technology as a Support for Students who are Deaf or Hard of Hearing. An International Symposium on Technology and Deaf Education, NTID, June 23-25, 2008.

Comments: Uses caption text placement, with the text moved around via digital ink; this allows students to take their own notes and to see more of the real-time note-taking process.

[4] Remote C-Print Captioning in the Educational Environment

[5] Lifelinks Tutors Foundation. A non-profit corporation offering free on-line video tutoring services to assist deaf children with their homework. This is the first time that any company has offered to provide free tutoring services on a "one-on-one" basis through a direct connection between the deaf student and the tutor. No interpreter is involved. Most of the tutors are deaf and teach in schools for the deaf during the daytime. Most also have advanced degrees in education and certification from universities for the deaf such as the Rochester Institute for the Deaf and Gallaudet University. The deaf student sees a photo of the available tutors arranged by subject: history, English, math, science, social studies, etc. The student merely clicks on the available tutor and is connected within 10 seconds.

[6] Linda Burik. Active Learning Through Technology: Creating a Technology-Infused Environment to Actively Engage Deaf Students in the Learning Process. Instructional Technology and Education of the Deaf Symposium, NTID, June 23-27, 2003

Comments: Networking in the classroom is occurring already, in both DHH and hearing classrooms. Uses wireless laptops and a SMART Board. The teacher can display students' screens (sort of like Classroom Presenter, but it didn't sound like students are "submitting"). Students following along now have digital copies of the notes and can engage in class at the same time.

[7] Richard Kheir, Thomas Way. Inclusion of deaf students in computer science classes using real-time speech transcription. SIGCSE conference on Innovation and Technology in Computer Science Education (ITiCSE '07), 261-265, 2007.

Comments: Another speech-to-text paper (can I just list these, or is there a survey out there?).

[8] Marc Marschark, Jeff B. Pelz, Carol Convertino, Patricia Sapere, Mary Ellen Arndt, Rosemarie Seewagen. Classroom Interpreting and Visual Information Processing in Mainstream Education for Deaf Students: Live or Memorex®? American Educational Research Journal, 42(4): 727-761, Winter 2005.

Comments: Video-based interpreting appears to be just as effective as in-person interpreting. Eye-tracking results show that skilled deaf signers spend more time looking at the interpreter than do novice signers. Hearing peers spend more time looking at the display than either skilled signers or novice signers.

[9] F. Dowaliby and H. Lang. Adjunct aids in instructional prose: A multimedia study with deaf college students. Journal of Deaf Studies and Deaf Education, 4: 270-282.

Comments: Tested effects on learning of (1) text only, (2) text and content movies, (3) text and sign movies, (4) text and adjunct questions, and (5) all of these together (full condition). Nothing mattered more than student participation (the adjunct questions).

[10] J. Schull. An extensible, scalable browser-based architecture for synchronous and asynchronous communication and collaboration systems for deaf and hearing individuals. In Assets ’06: Proceedings of the 8th international ACM SIGACCESS conference on Computers and accessibility, pages 285–286, New York, NY, USA, 2006. ACM Press.

[11] Donald H. Beil. Tablet PC – The New New Thing – Demonstration, and Implications in Deaf Education. Instructional Technology and Education of the Deaf Symposium, NTID, June 23-27, 2003

[12] Western Pennsylvania School for the Deaf boosts visual learning with SMART Board interactive whiteboards

[13] D. Miller, J. Culp, and D. Stotts. Facetop tablet :: note-taking assistance for deaf persons. In Assets ’06: Proceedings of the ACM SIGACCESS conference on Computers and accessibility, 247–248, New York, NY, USA, 2006. ACM Press. Another ref for the same project: [14]

[15] Marc Marschark, Greg Leigh, Patricia Sapere, Denis Burnham, Carol Convertino, Michael Stinson, Harry Knoors, Mathijs P. J. Vervloed, William Noble. Benefits of Sign Language Interpreting and Text Alternatives for Deaf Students' Classroom Learning. Journal of Deaf Studies and Deaf Education 11(4):421-437, 2006.

Comments: Paper presents several experiments comparing many different types of accommodation for deaf students (sign language instruction, sign language interpretation, real-time text with both CART (stenographer) and C-Print (trained speech-to-text captioner), and both C-Print and sign language together). The mixed results are not easy to parse, but there are a few findings relevant to us. Real-time text may be better than sign language interpretation for courses involving many new vocabulary terms. One study indicates that too much accommodation can be a bad thing: real-time text alone and sign language alone were each better received than both together (perhaps too much to visually attend to, resulting in information loss). In contrast, another study found having both sources of accommodation to be beneficial; that study showed both on the same computer screen. Students' perceived comprehension did not match their actual comprehension. Students learned more from sign language during class, but got more out of real-time text notes than out of video of the interpreter when studying.

[16] Lisa B. Elliot, Michael S. Stinson, Barbara G. McKee, Victoria S. Everhart and Pamela J. Francis. College Students' Perceptions of the C-Print Speech-to-Text Transcription System. Journal of Deaf Studies and Deaf Education, 6(4): 285-298, 2001.

Comments: "Looking back and forth I miss what is happening sometimes actually what is going on with the interpreter. But the information is wonderful on C-Print." Both C-Print and PepNet are good examples of projects that have networked many universities and resources to greater a bigger support system for students. United States Department of Education, through both the C-Print National Network Training Grant and the Postsecondary Education Programs Network (PEPNet). C-Print captionists are trained in ... and captions can be used as notes later (sometimes captionists will edit to create clearer notes). "C-Print works best in lecturebased courses and courses that rely more on words as opposed to formulas or graphics."

[17] S. Bennett, J. Hewitt, D. Kraithman, C. Britton. Making chalk and talk accessible. ACM Conference on Universal Usability 2003, Vancouver, British Columbia, Canada, 119-125.

Comment: Another speech-recognition system for real-time text captions.

[18] R. Kheir and T. Way. Inclusion of deaf students in computer science classes using real-time speech transcription. SIGCSE Bull., 39(3):261–265, 2007.

Comment: Yet another speech-recognition captioning system (same paper as [7]). The same folks have something similar in ASSETS 2007.

[19] Jill E. Preminger, Harry Levitt. Computer-Assisted Remote Transcription (CART): A Tool To Aid People Who Are Deaf or Hard of Hearing in the Workplace. Volta Review, 99(4): 219-230, Summer 1997.

Comment: Appears to be the best ref for CART. Also of interest with CART: the student can highlight portions of the text and add their own comments as the real-time text scrolls across the computer monitor. Another ref: [20]

[21] Johnny Carroll and Kevin McLaughlin. Closed captioning in distance education. ACM Journal of Computing Sciences in Colleges, 20(4): 183-189, April 2005.

[22] Jan Richards, Deborah Fels, Jim Hardman. The Educational Potential of the Signing Web. Instructional Technology and Education of the Deaf Symposium, NTID, June 27-30, 2005.

Comment: Same people as Sign Link Studio. ASL video can be edited so that certain parts link to other videos. The idea is to create a web of video the way the internet is a web of text and images.

      • Signing Avatars for Educational Purposes:

[23] [24] [25] Judy Vesel. Signing Science! Learning & Leading with Technology (ISTE: International Society for Technology in Education), 32(8): 30-35, May 2005.

[26] Ron Cole, Dominic W. Massaro, Jacques de Villiers, Brian Rundle, Khaldoun Shobaki, Johan Wouters, Michael Cohen, Jonas Beskow, Patrick Stone, Pamela Connors, Alice Tarachow, Daniel Solcher. New tools for interactive speech and language training: Using animated conversational agents in the classrooms of profoundly deaf children.

      • Less relevant stuff

[27] Kathleen F. McCoy and Lisa N. Masterman (Michaud). 1997. A tutor for teaching English as a second language for deaf users of American Sign Language. In Proceedings of Natural Language Processing for Communication Aids, an ACL/EACL97 Workshop, pages 160--164, Madrid, Spain, July.