Description
Abstract: Tensors are a fundamental data structure in many scientific contexts, such as time series analysis (signature path tensors) and materials science (stress-strain tensors), among many others. Improving our ability to produce and handle these tensors is essential to solving these problems efficiently. In this talk, we show how to exploit the symmetries of the underlying functions mapping tensors to tensors. More concretely, we present equivariant machine learning architectures on tensors that exploit the fact that, in many cases, these tensor-functions are equivariant with respect to the diagonal action of the orthogonal, Lorentz, and/or symplectic groups on tensors. Finally, we demonstrate our results on three problems coming from time series analysis, materials science, and theoretical computer science. Our numerical experiments demonstrate that our equivariant models perform better than corresponding non-equivariant baselines.
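The equivariance property described in the abstract can be illustrated with a minimal sketch (not the speaker's actual architecture; all names here are hypothetical). For an order-2 tensor T, the diagonal action of a group element g transforms T by applying g to every mode, and a map f is equivariant when f(g·T) = g·f(T). Symmetrization is a simple example of such a map:

```python
import numpy as np

def diagonal_action(g, T):
    # Diagonal action on an order-2 tensor: (g . T)_{ij} = g_{ia} g_{jb} T_{ab},
    # i.e. g is applied to every mode of T.
    return np.einsum("ia,jb,ab->ij", g, g, T)

def f(T):
    # A simple equivariant tensor-to-tensor map: symmetrization.
    # f(g . T) = g T g^T + (g T g^T)^T = g (T + T^T) g^T = g . f(T).
    return T + T.T

# Sample a random orthogonal matrix via QR decomposition.
rng = np.random.default_rng(0)
g, _ = np.linalg.qr(rng.standard_normal((4, 4)))
T = rng.standard_normal((4, 4))

# Equivariance check: f(g . T) == g . f(T)
lhs = f(diagonal_action(g, T))
rhs = diagonal_action(g, f(T))
print(np.allclose(lhs, rhs))  # True
```

Equivariant architectures are built by composing maps of this kind, so the whole network inherits the symmetry by construction rather than having to learn it from data.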
Speaker: Josue Tonelli-Cueto from Johns Hopkins University
Location
Computer Science Large Conference Room NPB 3.108A
Category:
Campus Events, Students