| MS TSA Meeting


Name of the Speaker: Reddy Joshna Manoj (EE21S113)
Guide: Dr. Kaushik Mitra
Online meeting link: https://meet.google.com/nmg-epku-vfh
Date/Time: 16th May 2024 (Thursday), 11:00 AM
Title: Near-Field Neural Rendering Guided by Single-Shot Photometric Stereo (Application: 3D endoscopy)

Abstract:

We present a novel near-field neural rendering approach that combines single-shot RGB photometric stereo with Signed Distance Function (SDF)-based neural rendering, where photometric-stereo cues guide the neural rendering of 3D meshes. Recent studies have shown that SDF-based neural implicit surface reconstruction methods can produce smoother, more complete 3D reconstructions. However, their performance tends to decline when capturing fine details in complex near-field images, due to the ambiguity of the RGB reconstruction loss. For instance, 3D endoscopy, characterized by near-field sparse views and complex surfaces, poses considerable challenges for existing volumetric SDF methods. Motivated by advances in near-field photometric stereo, our work explores the use of these cues to enhance neural implicit surface reconstruction from diverse perspectives. To simplify the acquisition of photometric stereo images in endoscopy setups, we employ a single-shot RGB photometric stereo capture for each view. Using a learning-based near-field photometric stereo network, we extract depth and normals for each view. These cues improve performance on very near-field natural objects as well as endoscopic scenes. We have extensively tested this approach on various rendered and real near-field datasets, and it consistently outperforms existing photometric NeRF techniques, especially on near-field multi-view imagery.
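As a rough illustration of how such photometric-stereo cues might enter the training objective (the exact formulation, network, and weights used in the talk may differ), one could add normal- and depth-consistency terms to the standard RGB rendering loss. The function below is a minimal, hypothetical sketch; all names and weights are illustrative:

```python
import numpy as np

def combined_loss(pred_rgb, gt_rgb, pred_normals, ps_normals,
                  pred_depth, ps_depth, w_normal=0.1, w_depth=0.1):
    """Illustrative total loss: RGB reconstruction plus consistency with
    photometric-stereo (PS) normals and depth. Arrays are per-pixel maps;
    normals have shape (..., 3). Weights w_normal, w_depth are assumptions."""
    # Standard RGB reconstruction term (mean squared error).
    rgb_loss = np.mean((pred_rgb - gt_rgb) ** 2)
    # Normal consistency: mean of (1 - cosine similarity) per pixel.
    dot = np.sum(pred_normals * ps_normals, axis=-1)
    norms = (np.linalg.norm(pred_normals, axis=-1)
             * np.linalg.norm(ps_normals, axis=-1) + 1e-8)
    normal_loss = np.mean(1.0 - dot / norms)
    # Depth consistency: mean absolute error against PS depth.
    depth_loss = np.mean(np.abs(pred_depth - ps_depth))
    return rgb_loss + w_normal * normal_loss + w_depth * depth_loss
```

In an SDF-based pipeline, the predicted normals would typically come from the gradient of the SDF at the rendered surface point, so the extra terms regularize the implicit surface directly rather than only through rendered colors.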

Our approach simplifies the capture process while maintaining high-quality results, which is crucial for near-field objects requiring precise detail. In our experiments, we capture multiple viewpoints of various near-field real-world objects using a mobile device. Additionally, within our lab, we use an RGB-based rigid endoscopy setup to gather diverse perspectives. With this approach, we achieve substantial improvements in surface reconstruction, notably in intricate near-field scenarios such as 3D endoscopy.