This paper fits a graph neural network to a physical system and then uses symbolic regression to recover real-world equations. The authors were even able to find a new equation.
The abstract does not give much away, but the work is related to stars and the universe. Take the three-body problem: even just three bodies produce very complex motion.
Can we derive the equations that govern the movement of the stars? We need to take into account the gravitational pull as well as many other things. (Newton came up with the equation himself; here we want to replicate that discovery via AI.)
Basically, the dataset is a frame-by-frame record of how the orbital system evolves, and we look at all of those frames over a long span of time.
From that we can learn how the orbits relate to one another, i.e. we can find the equation. This is a very hard problem to solve.
Traditional methods can find some kind of relationship, but when there are this many variables they hit their limits.
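Just to make that concrete, here is a toy illustration of what such frame-by-frame data for a small gravitational system could look like. This is my own sketch, not the paper's actual dataset or code; the simulator, step size, and units are all assumptions:

```python
import numpy as np

def simulate_frames(pos, vel, mass, steps=1000, dt=1e-3, G=1.0):
    """Generate frame-by-frame states of an N-body system under gravity.

    pos, vel: (N, 2) arrays of positions and velocities
    mass:     (N,)  array of masses
    Returns an array of shape (steps, N, 5): [x, y, vx, vy, m] per particle.
    """
    frames = []
    for _ in range(steps):
        # Pairwise gravitational force: F_ij = G * m_i * m_j * (r_j - r_i) / |r_ij|^3
        diff = pos[None, :, :] - pos[:, None, :]                   # (N, N, 2)
        dist = np.linalg.norm(diff, axis=-1) + np.eye(len(mass))   # avoid /0 on the diagonal
        force = G * (mass[:, None] * mass[None, :] / dist**3)[..., None] * diff
        acc = force.sum(axis=1) / mass[:, None]                    # net force / m_i
        # Crude Euler step (good enough for an illustration)
        vel = vel + dt * acc
        pos = pos + dt * vel
        frames.append(np.concatenate([pos, vel, mass[:, None]], axis=1))
    return np.stack(frames)

# Three bodies with random initial conditions
rng = np.random.default_rng(0)
data = simulate_frames(rng.normal(size=(3, 2)), rng.normal(size=(3, 2)) * 0.1,
                       np.array([1.0, 1.0, 1.0]))
print(data.shape)  # (1000, 3, 5)
```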
So in this paper, they use a graph neural network.
We have the network learn from the dataset; in this case, the network predicts the movements numerically.
It does not spit out an equation directly, but once we have a network that captures the dynamics, we can analyze that network and recover the underlying equation.
Holy cow, this is some big-brain stuff. We are basically doing math via deep learning: the neural network finds the equation from data.
The network used in this paper is a bit different.
In this paper, the number of vertices matches the number of particles in the simulated world, and every vertex carries properties such as x, y, delta-x, delta-y, mass, and more.
The other component is the edges, and every vertex is connected to every other vertex; this is not a sparse graph. Interesting. The complexity must be high.
The graph is constructed so that it can describe the movement of the particles in our universe.
You can also calculate the total force on a particle as a sum of individual pairwise forces. This is a constraint, but thankfully most systems can be described that way, so it is realistic; and graph networks are built for exactly this kind of structure.
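As a rough illustration of this graph encoding (again my own sketch, not the authors' code; the feature layout is an assumption), one frame can be turned into a node-feature matrix plus a fully connected edge list, and the additivity constraint just says the net force on a vertex is the sum over its incoming edges:

```python
import numpy as np

def frame_to_graph(frame):
    """Turn one frame of shape (N, 5) = [x, y, vx, vy, m] into graph tensors.

    Returns:
      nodes:      (N, 5) node-feature matrix (one vertex per particle)
      edge_index: (2, N*(N-1)) sender/receiver indices for a fully connected graph
    """
    n = frame.shape[0]
    senders, receivers = np.meshgrid(np.arange(n), np.arange(n), indexing='ij')
    mask = senders != receivers                       # drop self-edges
    edge_index = np.stack([senders[mask], receivers[mask]])
    return frame, edge_index

# Additivity assumption: the total force on particle i is the sum of the
# pairwise forces along its incoming edges, F_i = sum_j F_ij. The graph
# network mirrors this by summing edge messages at each receiving vertex.
```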
And again, the model's job is prediction: see a frame, and predict where each particle will go next.
The way we do this is by combining information from the connected vertices.
There is a relationship between any two vertices, and the deep learning model is able to learn that non-linear relationship. This is perfect for neural networks.
Step one is to compute the edge messages, and step two is to use those edge messages to compute the vertex message that makes the prediction.
Finally, the loss compares the predicted value against the ground-truth value. This is traditional deep learning: build the graph network, use a neural network to compute the edge relationships, and use those to compute the vertex relationships.
And the networks are shared: every edge uses the same edge network, and every vertex uses the same vertex network.
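Here is a minimal sketch of that two-step message passing in PyTorch, with one shared edge MLP, one shared vertex MLP, and a standard regression loss. The layer sizes, message dimension, and the choice of predicting a 2-D acceleration are my own assumptions, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class PairwiseGN(nn.Module):
    """Toy interaction network: edge MLP -> sum messages -> vertex MLP."""
    def __init__(self, node_dim=5, msg_dim=16):
        super().__init__()
        # One edge network shared by every edge, one vertex network shared by every vertex
        self.edge_mlp = nn.Sequential(nn.Linear(2 * node_dim, 64), nn.ReLU(),
                                      nn.Linear(64, msg_dim))
        self.node_mlp = nn.Sequential(nn.Linear(node_dim + msg_dim, 64), nn.ReLU(),
                                      nn.Linear(64, 2))   # predict a 2-D acceleration

    def forward(self, nodes, edge_index):
        send, recv = edge_index                           # sender / receiver indices
        # Step 1: edge messages from each (sender, receiver) pair of node features
        msgs = self.edge_mlp(torch.cat([nodes[send], nodes[recv]], dim=-1))
        # Step 2: sum incoming messages at each receiver, then update the vertex
        agg = torch.zeros(nodes.shape[0], msgs.shape[-1]).index_add_(0, recv, msgs)
        return self.node_mlp(torch.cat([nodes, agg], dim=-1))

# One training step against ground-truth accelerations from the simulator
model = PairwiseGN()
nodes = torch.randn(3, 5)                                 # 3 particles, [x, y, vx, vy, m]
edge_index = torch.tensor([[0, 0, 1, 1, 2, 2],
                           [1, 2, 0, 2, 0, 1]])           # fully connected, no self-edges
target_acc = torch.randn(3, 2)
loss = nn.MSELoss()(model(nodes, edge_index), target_acc)
loss.backward()
```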
These authors are seriously smart: they have built the structure of the physical system right into the network. And what was the result?
After training the network, we need to do symbolic regression: basically, try out different equations and find the one that best describes what the neural network has learned.
So, in short, we are able to find the equation that describes the system. This is a much simpler solution, yet the results are reasonable.
They use the Eureqa package for symbolic regression.
Something like the above?
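Eureqa is commercial software, so I won't guess at its API. As a crude stand-in for the idea, here is a sketch that fits a few hand-written candidate equations to (pretend) learned edge messages and keeps the best one; real symbolic regression searches over whole expression trees rather than a fixed list:

```python
import numpy as np
from scipy.optimize import curve_fit

# Pretend these are (m1, m2, distance) inputs and the most significant
# component of the learned edge message for many particle pairs.
rng = np.random.default_rng(1)
dx, dy = rng.normal(size=(2, 500))
m1, m2 = rng.uniform(0.5, 2.0, size=(2, 500))
r = np.sqrt(dx**2 + dy**2)
message = m1 * m2 / r**2 + 0.01 * rng.normal(size=500)    # stand-in for the GNN output

# A few hand-written candidate equations with free constants
candidates = {
    "a*m1*m2/r":    lambda X, a: a * X[0] * X[1] / X[2],
    "a*m1*m2/r**2": lambda X, a: a * X[0] * X[1] / X[2]**2,
    "a*r + b":      lambda X, a, b: a * X[2] + b,
}
X = np.stack([m1, m2, r])
for name, f in candidates.items():
    params, _ = curve_fit(f, X, message)                  # fit the constants
    mse = np.mean((f(X, *params) - message)**2)
    print(f"{name:15s} mse={mse:.4f} params={np.round(params, 3)}")
```

In this toy setup the inverse-square candidate should win by a wide margin, which is the same spirit as reading a known force law back out of the trained network.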
The general framework can be seen above, and depending on the problem formulation, we can go much deeper in learning the relationships.
I am not a physicist, but it seems that for a simple relationship, the model was able to learn a reasonable equation.
Credit: BecomingHuman. By: Jae Duk Seo