Differentiable Approximations for Distance Queries
IRB 0318 (Gannon) or https://umd.zoom.us/j/97919102992?pwd=LbSBM2MZy4QpVfnj92ukT5AIqyTYaO.1#s...
Many problems in computational geometry have outputs that vary smoothly and continuously as a function of the input; the diameter of a point set is one example. In the context of geometric query problems, a natural example is the nearest-neighbor distance problem: preprocess a point set P so that, given any query point q, the distance from q to its closest point in P can be returned. Clearly, this distance varies continuously as a function of q.
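For concreteness, here is a minimal brute-force sketch of the exact query (the names nn_distance, P, and q are illustrative, not from the talk). The returned value is a minimum of continuous functions of q and is therefore itself continuous in q:

    import jax.numpy as jnp

    def nn_distance(P, q):
        # Exact nearest-neighbor distance: the minimum over the n
        # distances from q to each point of P (an n x d array).
        return jnp.min(jnp.linalg.norm(P - q, axis=1))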
In many real-world applications, particularly those arising in deep learning, an algorithm's output is required to vary continuously as a function of its inputs. Learning agents often require that the function return both a value and its gradient. While this is a reasonable assumption for many real-valued exact geometric algorithms, it fails for virtually all approximation algorithms, where infinitesimal changes to the input can cause sudden jumps in the output. Unfortunately, many multi-dimensional problems in computational geometry, including nearest-neighbor searching, can only be solved approximately in practice.
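To see how an approximation can break continuity, consider a toy structure (illustrative only, not from the talk) that quantizes the query to a grid before answering, roughly in the spirit of quadtree-based approximate schemes. The answer depends on q only through the snapped point, so it is piecewise constant in q, jumping at cell boundaries, and its gradient with respect to q is zero wherever it is defined:

    import jax.numpy as jnp

    def snapped_nn_distance(P, q, cell=0.1):
        # Toy approximation: snap q to a grid of side `cell` before
        # querying. The output is piecewise constant in q and jumps
        # when q crosses a cell boundary; its gradient w.r.t. q is
        # zero wherever it exists.
        q_snap = jnp.round(q / cell) * cell
        return jnp.min(jnp.linalg.norm(P - q_snap, axis=1))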
In this talk, we will explore the challenges of transforming geometric data structures that return approximate answers into ones whose outputs are continuous and differentiable. We will present a general method for converting common approximation data structures in computational geometry into data structures that provide smooth, differentiable outputs, without sacrificing space or query-time efficiency.
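As a flavor of what smoothing can mean in this setting, one standard trick, not necessarily the method presented in the talk, replaces the hard minimum with a softmin (log-sum-exp). The result is differentiable everywhere in q and lies within an additive log(n)/beta of the exact nearest-neighbor distance:

    import jax
    import jax.numpy as jnp

    def soft_nn_distance(P, q, beta=50.0):
        # Softmin (log-sum-exp) relaxation of the minimum distance.
        # Differentiable everywhere in q, and within an additive
        # log(n)/beta of the exact nearest-neighbor distance.
        d = jnp.linalg.norm(P - q, axis=1)
        return -jax.nn.logsumexp(-beta * d) / beta

    # Value and gradient with respect to q in one call:
    P = jnp.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    q = jnp.array([0.4, 0.3])
    val, grad = jax.value_and_grad(soft_nn_distance, argnums=1)(P, q)

Larger beta tightens the approximation but steepens the gradients; the talk's contribution, per the abstract, is achieving such smooth outputs without giving up the space and query-time guarantees of the underlying structure.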