KEYWORDS: 3D modeling, Finite element methods, Motion models, Motion estimation, Data modeling, 3D image processing, Visual process modeling, Motion analysis, Machine vision, Computer vision technology
In this paper we address the problem of 3D motion and structure estimation of 3D objects appearing in 2D images; namely, the estimation of the parameters of 3D motion, such as rotation and translation in 3D space, and of the third dimension of objects that can be seen in a time sequence of 2D images. The study is restricted to the case of non-rigid objects. The main difficulty with non-rigid motion is that it is not structured in a way that allows its estimation with a few parameters, as is the case for rigid motion. For this reason, most approaches concentrate on representing deformable objects with suitable models that require only a small number of parameters for their description. In this work we survey algorithms proposed for the 3D motion and structure estimation of non-rigid objects using the Finite Element Method. A classification of established algorithms is presented with respect to the technique employed and the applications for which each is most suitable.
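The contrast drawn above, that rigid motion is fully described by a handful of parameters while non-rigid motion is not, can be made concrete with a minimal sketch. The code below (an illustration, not taken from the surveyed algorithms) applies a rigid 3D motion about the z-axis, which together with translation uses only four of the six rigid-body parameters; a non-rigid deformation, by contrast, may displace every vertex independently.

```python
import math

def rotate_z(p, theta):
    # Rotate point p = (x, y, z) about the z-axis by angle theta (radians).
    x, y, z = p
    c, s = math.cos(theta), math.sin(theta)
    return (c * x - s * y, s * x + c * y, z)

def rigid_motion(points, theta, t):
    # A rigid 3D motion restricted to one rotation axis: 1 rotation angle
    # plus 3 translation components. A full rigid motion needs only 6
    # parameters in total, independent of how many points the object has.
    tx, ty, tz = t
    return [tuple(a + b for a, b in zip(rotate_z(p, theta), (tx, ty, tz)))
            for p in points]

# Rotate a single point a quarter turn and lift it by one unit.
moved = rigid_motion([(1.0, 0.0, 0.0)], math.pi / 2, (0.0, 0.0, 1.0))
```

Estimating `theta` and `t` from point correspondences across frames is what makes the rigid case tractable; the surveyed FEM-based methods aim to recover a similarly compact parameter set for deformable objects.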
Research in facial expression has concluded that at least six emotions conveyed by human faces are universally associated with distinct expressions. Sadness, anger, joy, fear, disgust and surprise are categories of expressions that are recognizable across cultures. In this work we form a relation between the description of the universal expressions and the MPEG-4 Facial Definition Parameter Set (FDP). We also investigate the relation between the movement of basic FDPs and the parameters that describe emotion-related words according to some classical psychological studies. In particular, Whissel suggested that emotions are points in a space which appears to occupy two dimensions: activation and evaluation. We show that some of the MPEG-4 Facial Animation Parameters (FAPs), approximated by the motion of the corresponding FDPs, can be combined by means of a fuzzy rule system to estimate the activation parameter. In this way, variations of the six archetypal emotions can be achieved. Moreover, Plutchik concluded that emotion terms are unevenly distributed through the space defined by dimensions like Whissel's; instead, they tend to form an approximately circular pattern, called the 'emotion wheel,' modeled using an angular measure. The 'emotion wheel' can be used as a reference for creating intermediate expressions from the universal ones, by interpolating the movement of dominant FDP points between neighboring basic expressions. By exploiting the relation between the movement of the basic FDP points and the activation and angular parameters, we can model more emotions than the primary ones and achieve efficient recognition in video sequences.
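The interpolation step described above, blending the FDP displacements of two neighboring archetypal expressions according to a target emotion's angle on the wheel, can be sketched as follows. The wheel angles and displacement vectors below are hypothetical placeholders; the actual values come from the psychological studies and MPEG-4 tables referenced in the text.

```python
# Hypothetical wheel angles (degrees) for two neighboring archetypal
# expressions; real values would be taken from Plutchik-style studies.
WHEEL = {"joy": 20.0, "surprise": 80.0}

def interpolate_fdp(angle, expr_a, expr_b, disp_a, disp_b):
    # Linearly blend the FDP displacement vectors of two neighboring
    # basic expressions according to the target emotion's wheel angle.
    # disp_a / disp_b are per-point displacement magnitudes for expr_a
    # and expr_b respectively (illustrative flat lists here).
    a, b = WHEEL[expr_a], WHEEL[expr_b]
    w = (angle - a) / (b - a)  # 0 at expr_a, 1 at expr_b
    return [(1 - w) * da + w * db for da, db in zip(disp_a, disp_b)]

# Example: an intermediate emotion halfway between joy and surprise.
blend = interpolate_fdp(50.0, "joy", "surprise", [1.0, 0.0], [0.0, 2.0])
```

A fuller implementation would interpolate full 2D or 3D displacement vectors per FDP point and scale the result by the activation parameter estimated from the fuzzy rule system.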