UNIST Unveils Smart Contact Lens with Meniscus Pixel Printing for Vision-Based Robotic Control
The findings of this research have been published in Advanced Functional Materials on March 11, 2026.
Abstract

Contact lenses are emerging as strong candidates for next-generation extended reality (XR) interfaces due to their lightweight and ergonomic form factor. However, integrating photodetector arrays onto the limited area of a lens remains challenging with conventional micropatterning approaches, which rely on masks, multistep processes, and specialized equipment that inherently limit throughput and scalability. To address these constraints, we introduce a Meniscus Pixel Printing (MPP) strategy that enables rapid, mask-free patterning of MAPbI3 perovskite photodetectors without costly or complex fabrication tools. MPP uses a self-confined meniscus at a pipette tip to deterministically transfer perovskite ink, enabling 200 µm pixels to be printed within 1 s per pixel. In addition to planar substrates, MPP demonstrates stable pixel patterning on curved surfaces, highlighting its geometric adaptability and process versatility. Using this approach, we fabricate a 10 × 10 perovskite photodetector array and demonstrate stable photoresponse, retaining 92% of its initial performance after two months of storage. To overcome limited pixel density, a deep-learning-based super-resolution (SR) model reconstructs 10 × 10 inputs into 80 × 80 optical information with 97.2% accuracy and 0.03 s latency. Additionally, an AI-based eye-tracking system recognizes nine eye gestures with 99.3% accuracy, enabling smooth hands-free robotic arm control.

A research team, led by Professor Im Doo Jung from the Department of Mechanical Engineering at UNIST, has developed a groundbreaking smart contact lens that enables users to control robots through eye movements. This innovative device combines embedded optical sensors with AI-based signal processing, offering a lightweight, intuitive human-machine interface with vast potential across industries.
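The SR step above maps a 10 × 10 photodetector readout to an 80 × 80 image, an 8× upscaling in each dimension. The paper's learned model is not described here, so the following is only a minimal sketch of that shape transformation using nearest-neighbor replication as a naive stand-in baseline; the function name and the random input frame are illustrative assumptions.

```python
import numpy as np

def upscale_nearest(frame: np.ndarray, factor: int = 8) -> np.ndarray:
    """Upscale a low-resolution sensor frame by an integer factor using
    nearest-neighbor replication. This is a naive baseline only; the
    published work uses a deep-learning super-resolution model instead.
    """
    # np.kron repeats each pixel as a factor x factor block.
    return np.kron(frame, np.ones((factor, factor), dtype=frame.dtype))

# A 10 x 10 photocurrent map, standing in for the array's raw output.
lo = np.random.rand(10, 10)
hi = upscale_nearest(lo)
assert hi.shape == (80, 80)
```

A learned SR network would replace the replication step with convolutional layers trained to recover fine detail, which is how the reported 97.2% reconstruction accuracy goes beyond what simple interpolation can achieve.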
The lens incorporates a 10 × 10 array of light sensors capable of detecting subtle changes in light distribution caused by eye movements, including gaze direction and blinks. These signals are transmitted to control external robotic systems, as demonstrated with a robotic arm. Notably, the team employed a novel Meniscus Pixel Printing (MPP) technique to directly print sensors onto the curved lens surface without masks or complex fabrication steps, ensuring high precision and customizability.

In addition to robotic control, the system demonstrates vision-sensing capability by reconstructing optical information. To address the limited signal resolution inherent to micro-scale devices, the researchers applied a deep-learning-based super-resolution algorithm, reconstructing high-fidelity signals equivalent to an 80 × 80 sensor array in just 0.03 seconds. This enables real-time, accurate control based solely on eye movements, achieving recognition accuracies of up to 99.3% under experimental conditions.

This technology marks a significant advancement in ultra-compact human-machine interfaces, enabling precise, hands-free control of electronic devices. Potential applications include remote robotic operation, medical assistive devices, exploration of hazardous environments, defense systems, and smart mobility.

Published in the March 2026 issue of Advanced Functional Materials (Impact Factor: 19.0, JCR Top 5%), a top-tier journal in materials science, the research was selected as the front cover of the latest issue. The study received support from the National Research Foundation of Korea (NRF), the Ministry of Science and ICT (MSIT), the Institute of Information & Communications Technology Planning & Evaluation (IITP), and the Ministry of Trade, Industry, and Energy (MOTIE).

Journal Reference
Byung-Hoon Gong, Dohyean Kim, Jiyun Jeong, et al., "Meniscus Pixel Printing for Contact-Lens Vision Sensing and Robotic Control," Adv. Funct. Mater. (2026).
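The nine-gesture control scheme described above ultimately reduces to mapping classifier outputs onto robot commands. The specific gestures and command vocabulary used in the study are not listed here, so every label and command name below is a hypothetical placeholder; the sketch only illustrates the dispatch pattern, with unknown labels falling back to a safe "hold" state.

```python
# Hypothetical mapping from classified eye gestures to robotic-arm
# commands. The actual nine-gesture set and command names used in the
# study are assumptions for illustration only.
GESTURE_COMMANDS = {
    "look_up": "move_up",
    "look_down": "move_down",
    "look_left": "move_left",
    "look_right": "move_right",
    "look_up_left": "move_up_left",
    "look_up_right": "move_up_right",
    "look_down_left": "move_down_left",
    "look_down_right": "move_down_right",
    "double_blink": "toggle_gripper",
}

def dispatch(gesture: str) -> str:
    """Translate a classifier label into a robot command; unknown or
    low-confidence labels default to holding the current pose."""
    return GESTURE_COMMANDS.get(gesture, "hold")

print(dispatch("double_blink"))  # -> toggle_gripper
print(dispatch("noise"))         # -> hold
```

Defaulting to "hold" on unrecognized input is a deliberate safety choice for hands-free control: a misclassification then stalls the arm rather than moving it unpredictably.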
- 2026-04-23
- JooHyeon Heo