About This Document
- sl:arxiv_author :
- sl:arxiv_firstAuthor : Guillaume Lample
- sl:arxiv_num : 1912.01412
- sl:arxiv_published : 2019-12-02T15:05:24Z
- sl:arxiv_summary : Neural networks have a reputation for being better at solving statistical or
approximate problems than at performing calculations or working with symbolic
data. In this paper, we show that they can be surprisingly good at more
elaborate tasks in mathematics, such as symbolic integration and solving
differential equations. We propose a syntax for representing mathematical
problems, and methods for generating large datasets that can be used to train
sequence-to-sequence models. We achieve results that outperform commercial
Computer Algebra Systems such as Matlab or Mathematica.@en
- sl:arxiv_title : Deep Learning for Symbolic Mathematics@en
- sl:arxiv_updated : 2019-12-02T15:05:24Z
- sl:bookmarkOf : https://arxiv.org/abs/1912.01412
- sl:creationDate : 2019-12-09
- sl:creationTime : 2019-12-09T17:11:42Z
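
The abstract mentions a syntax for representing mathematical problems as sequences that a sequence-to-sequence model can consume; the paper serializes expression trees in prefix (Polish) notation. The sketch below illustrates that idea only: the nested-tuple tree format, the operator names (`add`, `mul`), and the `to_prefix` helper are hypothetical, not the paper's actual implementation.

```python
def to_prefix(node):
    """Serialize a nested-tuple expression tree into a prefix token list.

    Internal nodes are tuples of (operator, operand, ...); leaves are
    numbers or variable names. Prefix order lets a tree be encoded as a
    flat token sequence without parentheses.
    """
    if isinstance(node, tuple):
        op, *args = node
        tokens = [op]
        for arg in args:
            tokens.extend(to_prefix(arg))
        return tokens
    return [str(node)]

# Example: the expression 2 + 3*x as the tree (add 2 (mul 3 x))
expr = ("add", 2, ("mul", 3, "x"))
print(to_prefix(expr))  # ['add', '2', 'mul', '3', 'x']
```

Because prefix notation is unambiguous for operators of known arity, the token sequence can be decoded back into a unique tree, which is what makes this encoding suitable as seq2seq input and output.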