Please use this identifier to cite or link to this item:
http://archives.univ-biskra.dz/handle/123456789/666
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | GHANEM, KHADOUDJA | - |
dc.contributor.author | CAPLIER, ALICE | - |
dc.contributor.author | KHOLLADI, M.K. | - |
dc.date.accessioned | 2013-12-30T18:39:48Z | - |
dc.date.available | 2013-12-30T18:39:48Z | - |
dc.date.issued | 2013-12-30 | - |
dc.identifier.uri | http://archives.univ-biskra.dz/handle/123456789/666 | - |
dc.description.abstract | In this work we report progress on building a system that estimates the intensity of an unknown facial expression from still images, based on a study of the degree of deformation of permanent facial features. Facial changes can be identified as facial action units, which correspond to muscle movements. We analyze subtle changes in facial expression by interpreting each muscle movement through the corresponding distances computed between characteristic facial points. Each changed distance is compared with a corresponding threshold and mapped to a symbolic state that qualitatively encodes how much the distance differs from its value in the neutral state. The Transferable Belief Model is used to fuse the data from all changed distances. Expression intensity is quantified as high, medium, or low. Several reasons are given to show that it is better to estimate the intensity of an unknown expression than of a known one. | en_US |
dc.language.iso | en | en_US |
dc.subject | Facial expression | en_US |
dc.subject | expression intensity | en_US |
dc.subject | belief theory | en_US |
dc.title | INTENSITY ESTIMATION OF UNKNOWN EXPRESSION BASED ON A STUDY OF FACIAL PERMANENT FEATURES DEFORMATIONS | en_US |
dc.type | Article | en_US |
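The abstract outlines a pipeline: compute facial distances, compare each with a threshold relative to its neutral value, map it to a symbolic state, and fuse the evidence with the Transferable Belief Model. The following is a minimal sketch of that idea, not the authors' implementation; the threshold values, the `doubt` mass, and the basic belief assignment are illustrative assumptions, and only the conjunctive combination rule (which keeps conflict on the empty set, as in the TBM) follows the model named in the abstract.

```python
from itertools import product

# Frame of discernment: the three intensity levels used in the paper.
FRAME = frozenset({"low", "medium", "high"})

def bba_from_distance(distance, neutral, low_thr=0.10, high_thr=0.30, doubt=0.2):
    """Assumed basic belief assignment: the state selected by the thresholds
    gets mass 1 - doubt, and the remainder goes to the whole frame (ignorance).
    Threshold values here are illustrative, not taken from the paper."""
    change = abs(distance - neutral) / neutral
    if change < low_thr:
        state = "low"
    elif change < high_thr:
        state = "medium"
    else:
        state = "high"
    return {frozenset({state}): 1.0 - doubt, FRAME: doubt}

def conjunctive_combine(m1, m2):
    """TBM conjunctive rule: masses multiply on intersecting focal sets;
    conflicting mass stays on the empty set (no normalization)."""
    out = {}
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        out[a & b] = out.get(a & b, 0.0) + ma * mb
    return out

# Fuse the evidence carried by three characteristic distances,
# each measured against an (assumed) neutral value of 10.0.
masses = [bba_from_distance(d, 10.0) for d in (12.0, 12.5, 10.4)]
fused = masses[0]
for m in masses[1:]:
    fused = conjunctive_combine(fused, m)

# Decide on the singleton intensity level with the highest fused mass.
best = max((s for s in fused if len(s) == 1), key=fused.get)
```

Two of the three distances point to a medium deformation and one to a low deformation, so the fused masses favor the "medium" singleton; the mass left on the empty set measures the conflict between the sources.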
Appears in Collections: | CS N 14 |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
14-k.ghanem.pdf | | 592.95 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.