I am experimenting with machine learning techniques for solving partial differential equations (PDEs). My goal is to use the solutions from several previous time steps to predict the solution at the next time step, in the spirit of classical multistep time-stepping methods.
However, when I applied this approach to the one-dimensional Kuramoto-Sivashinsky equation, I did not observe any improvement in the predictions compared to using only the current time step.
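For concreteness, here is a minimal sketch of the kind of setup I mean. It is simplified: synthetic traveling-wave snapshots stand in for the KS solver output, and plain least squares stands in for the learned model; the function names (`multistep_dataset`, `fit_and_score`) and the grid/step sizes are just illustrative choices.

```python
import numpy as np

# Synthetic stand-in for PDE snapshots: a traveling wave on a periodic grid.
# (In my real experiments these come from a Kuramoto-Sivashinsky solver.)
nx, nt = 64, 400
x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
t = np.arange(nt) * 0.05
u = np.sin(x[None, :] - t[:, None])  # u[n] is the solution at time step n

def multistep_dataset(u, k):
    """Stack the k most recent snapshots as input; the next snapshot is the target."""
    X = np.concatenate([u[i:len(u) - k + i] for i in range(k)], axis=1)
    y = u[k:]
    return X, y

def fit_and_score(u, k, split=300):
    """Fit a one-step predictor using k history steps; return test MSE."""
    X, y = multistep_dataset(u, k)
    Xtr, ytr = X[:split], y[:split]
    Xte, yte = X[split:], y[split:]
    # Linear least squares as a stand-in for the learned model.
    W, *_ = np.linalg.lstsq(Xtr, ytr, rcond=None)
    return np.mean((Xte @ W - yte) ** 2)

for k in (1, 3):
    print(f"history k={k}: test MSE = {fit_and_score(u, k):.3e}")
```

The question is essentially whether, for some class of PDEs, the `k > 1` variant of this setup should beat `k = 1` by a meaningful margin.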
This led me to wonder whether the issue lies with the specific equation I chose. Is there a class of PDEs that is particularly well-suited to multistep methods in a machine learning setting, i.e., equations for which information from previous time steps significantly improves prediction accuracy?