Advancements in AI have revolutionized how machines perceive and interact with the world. However, almost all current progress has been limited to the text, visual, and auditory modalities. Our group aims to extend this revolution to smell, enabling AI to perceive odors as humans do. Smell is crucial to environmental perception, social interaction, and the regulation of well-being. AI that can recognize smells could enhance entertainment, gaming, and marketing, and enable quality control in the chemical, food and beverage, and manufacturing industries. More ambitiously, smell-based diagnostics could support early disease detection (e.g., of COVID-19) and even ‘smell’ hormones and other markers of emotional state, stress, and early-stage cancer. Fundamental advances in AI for smell can therefore have immense impact on the world.
Smell, however, is an almost entirely new data modality for AI, with little progress to date compared to vision and language. We believe that large-scale data and AI models are key to learning rich feature representations of smell for sensing, transmission, and fusion with the other human senses. This strategy differs fundamentally from past research, which has emphasized human domain knowledge, feature engineering, small datasets, and simple feature-based classification models rather than large-scale data-driven learning. Our research uses portable gas sensors to build a large-scale database of how foods, beverages, and natural objects ‘smell’, develops AI models that learn from and classify these smell data, and applies the resulting systems to real-time food and beverage classification, with a case study on allergen detection (e.g., ‘smelling’ gluten or peanuts in a cake). This approach can be transformative because it leverages the scale of big data rather than domain engineering, yielding systems that hold up under real-world diversity.
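To make the data-driven strategy concrete, the sketch below shows one way a model could learn smell representations directly from raw multi-channel gas-sensor time series rather than from hand-engineered features. It is a minimal illustration, not the project's actual pipeline: the sensor channel count, sequence length, architecture, and allergen labels are all illustrative assumptions.

```python
# Minimal sketch: end-to-end learning from raw gas-sensor recordings.
# Assumes each sample is a (num_channels, num_timesteps) array from a
# portable sensor and a hypothetical label such as allergen present/absent.
import torch
import torch.nn as nn

class SmellEncoder(nn.Module):
    """1D CNN mapping raw multi-channel sensor readings to an embedding."""
    def __init__(self, num_channels: int = 8, embed_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(num_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool over time -> fixed-size vector
            nn.Flatten(),
            nn.Linear(64, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_channels, num_timesteps) raw sensor readings
        return self.net(x)

class SmellClassifier(nn.Module):
    """Learned embedding plus a linear head, trained end to end."""
    def __init__(self, num_classes: int, num_channels: int = 8):
        super().__init__()
        self.encoder = SmellEncoder(num_channels=num_channels)
        self.head = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.encoder(x))

# Toy training step on random data standing in for real sensor recordings.
model = SmellClassifier(num_classes=2)      # e.g. allergen present / absent
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(16, 8, 256)                 # batch of 16 hypothetical recordings
y = torch.randint(0, 2, (16,))              # hypothetical labels
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```

The design choice this sketch highlights is that the feature representation (the encoder) is itself learned from large-scale labeled data, in contrast to prior approaches that classify a fixed set of hand-engineered descriptors.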