Please use this identifier to cite or link to this item: https://repository.hneu.edu.ua/handle/123456789/40026
Title: Mutual Information Preference Optimization for Robust Multi-Modal Recipe Generation
Authors: Shaposhnyk M.
Minukhin S.
Keywords: Food Computing
Llama 3.1
DenseNet-121
Mutual Information Preference Optimization
Visual Grounding
Issue Date: 2026
Citation: Shaposhnyk M. Mutual Information Preference Optimization for Robust Multi-Modal Recipe Generation / M. Shaposhnyk, S. Minukhin // Modern Information Technologies and Artificial Intelligence Systems MIT&AIS-2026: Proceedings of the 2nd International Scientific and Practical Conference, April 27-29, 2026, Kharkiv – Yaremche, Ukraine. Kharkiv, 2026. P. 113-117.
Abstract: This study evaluates the impact of Mutual Information Preference Optimization (MIPO) as a corrective layer within a hybrid vision-language architecture. Rather than introducing a new standalone framework, the research modifies an existing multimodal pipeline by integrating MIPO to bridge the operational gap between a DenseNet-121 ensemble and Llama 3.1 8B. The central hypothesis, namely that LLMs can act as autonomous semantic filters, was tested through contrastive alignment, which synchronizes CNN-derived visual features with the textual latent space. Experimental results on the Food-101 dataset validate this modification, demonstrating that the system can suppress false-positive detections without completely retraining the visual backbone. By filtering out incongruous artifacts through preference optimization, the modified architecture achieved a 60.8% reduction in semantic hallucinations. This confirms the viability of using LLMs for real-time error correction in specialized domains, such as personalized dietetics, where output fidelity is a critical requirement.
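As a concrete illustration of the preference-optimization step the abstract describes, the following is a minimal PyTorch sketch of a DPO-style pairwise loss, assuming "chosen" completions are recipes grounded in the DenseNet-121 detections and "rejected" completions contain hallucinated ingredients. The function name, the beta temperature, and the pairing scheme are illustrative assumptions; the paper's exact MIPO objective, including its mutual-information weighting, is not reproduced here.

    # Hypothetical sketch of a DPO-style preference loss for suppressing
    # hallucinated ingredients. All names and the beta value are assumptions,
    # not the paper's code.
    import torch
    import torch.nn.functional as F

    def preference_loss(policy_chosen_logps: torch.Tensor,
                        policy_rejected_logps: torch.Tensor,
                        ref_chosen_logps: torch.Tensor,
                        ref_rejected_logps: torch.Tensor,
                        beta: float = 0.1) -> torch.Tensor:
        """Pairwise loss: prefer recipes grounded in detected ingredients
        (chosen) over recipes containing hallucinated items (rejected)."""
        # Implicit rewards are log-probability ratios against a frozen
        # reference model, scaled by the beta temperature.
        chosen_rewards = beta * (policy_chosen_logps - ref_chosen_logps)
        rejected_rewards = beta * (policy_rejected_logps - ref_rejected_logps)
        # Maximize the margin between grounded and hallucinated completions.
        return -F.logsigmoid(chosen_rewards - rejected_rewards).mean()

    # Usage: per-sequence log-probabilities for a batch of four preference pairs.
    logp = torch.tensor([-12.3, -9.8, -15.1, -11.0])
    loss = preference_loss(logp, logp - 0.7, torch.zeros(4), torch.zeros(4))
    print(f"preference loss: {loss.item():.4f}")

Under these assumptions, minimizing the margin loss shifts the language model's likelihood toward ingredient-grounded recipes while leaving the visual backbone untouched, which matches the corrective-layer behaviour the abstract reports.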
URI: https://repository.hneu.edu.ua/handle/123456789/40026
Appears in Collections: Articles (IS)

Files in This Item:
File: MIT&AIS_2026_main.....pdf (1.54 MB, Adobe PDF)

