I've used an explicit finite difference scheme to model the 1D, time-dependent temperature distribution in a friction weld. I now want to verify the consistency and convergence of my algorithm.
Since I have no exact solution, I assume I must use a solution computed on a very refined mesh as the reference when calculating the relative error of the coarser meshes. I plan to calculate the $L^2$ and $L^\infty$ error norms at a number of time intervals (during both heating and cooling). My understanding is that the $L^2$ norm will provide the best overall description of the error, whereas the $L^\infty$ norm will allow me to bound the worst-case pointwise error.
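For concreteness, here is a minimal sketch of the kind of study I have in mind, using a generic explicit (FTCS) solve of the 1D heat equation as a stand-in for my actual weld model (the domain, initial condition, and parameter values here are illustrative assumptions, not my real problem). The reference solution is computed on a nested fine grid so coarse nodes coincide with fine nodes, and the observed order is estimated from successive refinements:

```python
import numpy as np

def solve_heat_ftcs(nx, alpha=1.0, T=0.1):
    """Explicit FTCS solve of u_t = alpha*u_xx on [0,1] with u=0 at both ends."""
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    # Stability requires dt <= dx^2/(2*alpha); use a safety factor of 0.4
    nt = int(np.ceil(T / (0.4 * dx**2 / alpha)))
    dt = T / nt
    u = np.sin(np.pi * x)                 # illustrative initial condition
    for _ in range(nt):
        u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return x, u

# Reference solution on a much finer, nested grid (plays the role of "exact")
nx_ref = 641
_, u_ref = solve_heat_ftcs(nx_ref)

errors = []
for nx in (21, 41, 81):
    x, u = solve_heat_ftcs(nx)
    ratio = (nx_ref - 1) // (nx - 1)      # nested grids: coarse nodes lie on fine nodes
    u_ref_c = u_ref[::ratio]              # restrict reference to the coarse grid
    dx = x[1] - x[0]
    e = u - u_ref_c
    l2 = np.sqrt(dx * np.sum(e**2))       # grid-weighted discrete L2 norm
    linf = np.max(np.abs(e))              # worst-case pointwise error
    errors.append((nx, l2, linf))
    print(f"nx={nx:3d}  L2={l2:.3e}  Linf={linf:.3e}")

# Observed order of accuracy from successive refinements
# (with dt ~ dx^2, both space and time errors scale as dx^2, so expect ~2)
for (n1, l2a, _), (n2, l2b, _) in zip(errors, errors[1:]):
    print(f"order between nx={n1} and nx={n2}: {np.log2(l2a / l2b):.2f}")
```

The same norm computation would be repeated at several output times during heating and cooling, rather than only at the final time as in this sketch.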
So, is this a logical and robust method by which to estimate my error and confirm the consistency of my difference scheme?