{"id":2100,"date":"2024-11-22T18:05:22","date_gmt":"2024-11-22T21:05:22","guid":{"rendered":"https:\/\/web.inf.ufpr.br\/vri\/?page_id=2100"},"modified":"2024-11-22T18:05:22","modified_gmt":"2024-11-22T21:05:22","slug":"ufpr-closeup","status":"publish","type":"page","link":"https:\/\/web.inf.ufpr.br\/vri\/databases\/ufpr-closeup\/","title":{"rendered":"UFPR-Close-Up Dataset"},"content":{"rendered":"\n<figure class=\"wp-block-image size-large\"><img fetchpriority=\"high\" decoding=\"async\" width=\"1024\" height=\"423\" src=\"https:\/\/web.inf.ufpr.br\/vri\/wp-content\/uploads\/sites\/7\/2024\/11\/UFPR-Close-Up_movement-1024x423.png\" alt=\"\" class=\"wp-image-2101\" srcset=\"https:\/\/web.inf.ufpr.br\/vri\/wp-content\/uploads\/sites\/7\/2024\/11\/UFPR-Close-Up_movement-1024x423.png 1024w, https:\/\/web.inf.ufpr.br\/vri\/wp-content\/uploads\/sites\/7\/2024\/11\/UFPR-Close-Up_movement-300x124.png 300w, https:\/\/web.inf.ufpr.br\/vri\/wp-content\/uploads\/sites\/7\/2024\/11\/UFPR-Close-Up_movement-768x317.png 768w, https:\/\/web.inf.ufpr.br\/vri\/wp-content\/uploads\/sites\/7\/2024\/11\/UFPR-Close-Up_movement-360x149.png 360w, https:\/\/web.inf.ufpr.br\/vri\/wp-content\/uploads\/sites\/7\/2024\/11\/UFPR-Close-Up_movement.png 1439w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>The UFPR-Close-Up dataset contains videos of an active user interaction called the close-up movement, which is composed of two steps. In the first phase, the user must move away from the device to position their face within a small area highlighted on the screen. Once the face is properly aligned and held in place for at least one second, the second phase begins. In this phase, the user is prompted to position their face within a larger area displayed on the screen, requiring them to move closer to the device, align their face again, and hold the position for one second. 
The face detector continuously monitors the face&#8217;s position; if alignment is lost, the app reverts to the previous step, requiring the user to realign their face within the indicated area.<\/p>\n\n\n\n<p>The dataset comprises 2,561 videos, of which 714 are genuine samples from volunteer subjects and 1,847 are spoof samples created using selected face images from the CelebA dataset and CelebV videos, displayed on several presentation attack instruments covering the most common categories of attacks. It was introduced in our paper&nbsp;<strong><a href=\"\" target=\"_blank\" rel=\"noreferrer noopener\">[PDF]<\/a><\/strong>.<\/p>\n\n\n\n<p>Live samples were captured by the participants on their own smartphones, with a minimum interval of 12 hours between sessions, through a mobile application (app) developed by us. Spoof samples were recorded using devices from the Android family (Xiaomi Redmi Note 13, Moto G54, Samsung Galaxy S22FE, Samsung Galaxy S23, Samsung Galaxy A54, Samsung Galaxy A34) and the iOS family (iPhone 8, iPhone 12, iPhone 14) with a modified version of the app that allows labeling the spoof attack type and the instrument used while recording samples.<\/p>\n\n\n\n<p>In total, the dataset has 382 different live subjects and 1,043 spoof targets. Spoof targets were selected based on the gender and pose (yaw and pitch angles) distributions of the live samples. Spoof samples cover the following presentation attacks: Photo, Display, Replay, Scaling, and Mask. 
Each attack type includes two different presentation attack instruments.<\/p>\n\n\n\n<p>Four different protocols were proposed to evaluate the robustness of active presentation attack detectors in challenging scenarios, each introducing a small domain shift, such as unseen presentation attack instruments, unknown presentation attacks, or different camera noise.<\/p>\n\n\n\n<p>Details on the distribution of videos by presentation attack, the protocols, and the benchmark are in our&nbsp;<strong><a href=\"\" target=\"_blank\" rel=\"noreferrer noopener\">paper<\/a><\/strong>.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">How to Obtain the Dataset<\/h3>\n\n\n\n<p>Due to private financial support for the creation of this dataset and in compliance with data protection agreements, all spoof samples are currently available, but live samples will be released starting in April 2027.<\/p>\n\n\n\n<p>The UFPR-Close-Up dataset is released&nbsp;<strong>only<\/strong>&nbsp;to academic researchers from educational or research institutes for&nbsp;<strong>non-commercial purposes<\/strong>.&nbsp;<\/p>\n\n\n\n<p>To download the dataset, please read&nbsp;<strong><a href=\"\" target=\"_blank\" rel=\"noreferrer noopener\">this license agreement<\/a><\/strong> carefully, fill it out, and send it back to Professor David Menotti (<a rel=\"noreferrer noopener\" href=\"mailto:menotti@inf.ufpr.br\" target=\"_blank\">menotti@inf.ufpr.br<\/a>). 
The license agreement MUST be reviewed and signed by the individual or entity authorized to make legal commitments on behalf of the institution or corporation (e.g., Department\/Administrative Head, or similar).&nbsp;<strong>We cannot accept licenses signed by students or faculty members.<\/strong><\/p>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>References<\/strong><\/h3>\n\n\n\n<p>If you use the UFPR-Close-Up dataset in your research, please cite our paper:<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li>BibTeX<\/li>\n<\/ul>\n\n\n\n<h3 class=\"wp-block-heading\"><strong>Contact<\/strong><\/h3>\n\n\n\n<p>Please contact Bruno H. Kamarowski (<a href=\"mailto:bhkc18@inf.ufpr.br\" target=\"_blank\" rel=\"noreferrer noopener\">bhkc18@inf.ufpr.br<\/a>) with questions or comments.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The UFPR-Close-Up dataset contains videos of an active user interaction called the close-up movement, which is composed of two steps. In the first phase, the user must move away from the device to position their face within a small area highlighted on the screen. 
<a href=\"https:\/\/web.inf.ufpr.br\/vri\/databases\/ufpr-closeup\/\" class=\"read-more\">Read More &#8230;<\/a><\/p>\n","protected":false},"author":16,"featured_media":0,"parent":16,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-2100","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/2100","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/users\/16"}],"replies":[{"embeddable":true,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/comments?post=2100"}],"version-history":[{"count":1,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/2100\/revisions"}],"predecessor-version":[{"id":2102,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/2100\/revisions\/2102"}],"up":[{"embeddable":true,"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/pages\/16"}],"wp:attachment":[{"href":"https:\/\/web.inf.ufpr.br\/vri\/wp-json\/wp\/v2\/media?parent=2100"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}