Affective Air Quality Dataset: Personal Chemical Emissions from Emotional Videos

Jas Brooks, Javier Hernandez, Mary Czerwinski, Judith Amores

Published: September 2025

Inspired by the role of chemosignals in conveying emotional states, this paper introduces the Affective Air Quality (AAQ) dataset, a novel dataset collected to explore the potential of volatile odor compound and gas sensor data for non-contact emotion detection. This dataset bridges the gap between breath and body odor emission (personal chemical emission) analysis and established practices in affective computing. Comprising 4-channel gas sensor data from 23 participants, recorded at two distances from the body (wearable and desktop), alongside emotional ratings elicited by targeted movie clips, the dataset lays initial groundwork for analyzing the correlation between personal chemical emissions and varied emotional responses. The AAQ dataset also provides insights drawn from exit interviews, painting a holistic picture of perceptions regarding air quality monitoring and its implications for privacy.

By offering this dataset, together with preliminary emotion recognition models built on it, to the broader research community, we seek to advance the development of odor-based affect recognition models that prioritize user privacy and comfort.

Research area: Human-Computer Interaction
Fields of study: Affective Computing; Human-Computer Interaction; Wearable Computing
Paper: https://arxiv.org/abs/2509.15774
Code: https://github.com/microsoft/aaq
Related project: Digital Empathy for Everyday AI