CALL FOR PAPERS
                      Computer Vision and Image Understanding
                    Special Issue on Eye Detection and Tracking
 Guest Editors:
    Qiang Ji, Rensselaer Polytechnic Institute, qji@ecse.rpi.edu
    Harry Wechsler, George Mason University, wechsler@cs.gmu.edu
    Andrew Duchowski, Clemson University, andrewd@vr.clemson.edu
    Myron Flickner, IBM Almaden Research, flick@almaden.ibm.com
 
 AIMS AND SCOPE
 As one of the most salient features of the human face, the eyes play an
 important role in interpreting and understanding a person's desires, needs,
 and emotional states.  Robust, non-intrusive eye detection and tracking is,
 therefore, crucial for human-computer interaction, attentive user interfaces,
 and understanding human affective states.  In addition, the unique geometric,
 photometric, and motion characteristics of the eyes provide important visual
 cues for face detection, face recognition, and facial expression
 understanding.
 
 There has been much work on eye detection and tracking. The existing work
 can be broadly classified into two categories: traditional image-based
 passive approaches and active IR-based approaches. The former can be further
 divided into appearance-based, model-based, feature-based, and motion-based
 methods. The latter exploits the spectral properties of the pupil under
 near-IR illumination, and eye tracking is accomplished by tracking the
 bright (or dark) pupils.  Recently, there have been efforts to combine the
 passive and active approaches to produce more robust eye tracking.
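 
 To make the active IR approach concrete, the sketch below illustrates the
 bright/dark pupil differencing step described above.  It is a minimal
 illustration only: it assumes two synchronized grayscale frames captured
 under on-axis (bright-pupil) and off-axis (dark-pupil) near-IR illumination,
 and the function name, threshold, and blob-size bounds are hypothetical
 rather than taken from any particular tracker.
 
     import cv2
 
     def detect_pupils(bright_frame, dark_frame, diff_threshold=40):
         """Locate candidate pupils from a bright/dark pupil frame pair.
 
         Under on-axis near-IR illumination the pupil retro-reflects and
         appears bright; under off-axis illumination it appears dark.
         Differencing the two frames cancels most of the scene and leaves
         the pupils as high-intensity blobs.
         """
         # Saturating difference: pupils respond strongly, background cancels.
         diff = cv2.subtract(bright_frame, dark_frame)
 
         # Threshold to isolate candidate pupil blobs (value is illustrative).
         _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
 
         # Connected-component analysis; keep plausibly pupil-sized blobs.
         n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
         pupils = []
         for i in range(1, n):  # label 0 is the background
             if 20 <= stats[i, cv2.CC_STAT_AREA] <= 400:  # illustrative bounds
                 pupils.append(tuple(centroids[i]))
         return pupils
 
 In a complete system, the returned centroids would seed a tracker and be
 verified against appearance cues; combining such a detector with a passive
 appearance model is precisely the hybrid direction noted above.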
 
 Despite these efforts, robust, accurate, and non-intrusive eye detection and
 tracking remains largely an unsolved problem. The challenges result from eye
 closure, eye occlusion, variability in scale and face orientation, and
 varying lighting conditions. Many commercial eye trackers tend to be
 intrusive and restrictive, and they often require a cumbersome calibration
 process. On the other hand, existing non-intrusive eye tracking techniques
 lack the robustness and accuracy needed for practical use.
 
 The special issue solicits original research that focuses on the following
 aspects of eye detection and gaze estimation:
 *  non-intrusive eye detection and tracking
 *  non-intrusive eye gaze estimation requiring minimal or no user calibration
 *  detection and characterization of eye gestures and activities
 *  applications of eye detection and tracking techniques
 *  comprehensive review/survey of the existing technologies in eye and gaze
    detection and tracking
 
 SUBMISSION GUIDELINES
 Only original, high-quality papers -- in line with the CVIU guidelines --
 will be considered for publication in this special issue.  Prospective
 authors unsure whether their paper falls within the scope of this special
 issue should send an abstract to the guest editors for a preliminary
 evaluation prior to submitting the full paper.
 
 All papers should be submitted electronically via our HTTP server.  Files
 should be in PDF or PS format.  Authors should also submit a cover letter
 in plain text with the following information: the title of the submitted
 article, the name of the submitted file, all authors' full names, and the
 corresponding author's mailing address, daytime phone number, and e-mail
 address.  Please send all cover letters via e-mail to Prof. Qiang Ji at
 qji@ecse.rpi.edu.
 
 The URL for our HTTP server is http://cviu.ecse.rpi.edu/cviu/login.php
 Follow the instructions there to submit your paper.
 
 If you use LaTeX, please use the elsart style files to format your
 manuscript.  The LaTeX styles can be downloaded from
 http://www.elsevier.com/locate/latex
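 
 For reference, a minimal elsart skeleton is sketched below; the title,
 author, and address fields are placeholders, and the exact front-matter
 commands should be checked against the documentation shipped with the
 style bundle.
 
     \documentclass{elsart}
 
     \begin{document}
 
     \begin{frontmatter}
     \title{Title of the Submitted Article}
     \author{First A. Author}
     \address{Affiliation, City, Country}
     \begin{abstract}
     A brief abstract of the manuscript.
     \end{abstract}
     \end{frontmatter}
 
     % Body of the manuscript goes here.
 
     \end{document}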
 
 IMPORTANT DATES:
 * Electronic submission of full manuscripts:            October 15, 2003
 * Notification to authors:                              February 15, 2004
 * Submission of revised manuscripts:                    April 15, 2004
 * Final decision on accepted papers:                    May 15, 2004
 * Publication of special issue:                         Third quarter 2004