Benchmarking web page performance
I currently work in QA at a small software company that develops web-based training software. Most of my work is manual usability testing, but recently I've been tasked with performance testing pages and features of the site, as it has been running very slowly in certain areas. At the moment I time page loads either manually, using the stopwatch on my phone, or through the Chrome DevTools, before and after any performance optimisation has been implemented.
I am 99% sure there must be a much better and more precise way of doing this, and I would really appreciate someone pointing me in the right direction, as I'm struggling to research this question on my own.
Firstly, I would say that Chrome DevTools is a reasonable strategy if you perform the test in roughly the same way each time and only care about an overall improvement in performance.
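For example, rather than eyeballing the waterfall, you can read exact load timings from the standard Navigation Timing API directly in the DevTools console. A minimal sketch (the metrics shown are from the spec; which ones matter is up to you):

```javascript
// Paste into the DevTools console after the page has finished loading.
// Navigation Timing Level 2: one entry describes the page navigation.
const [nav] = performance.getEntriesByType('navigation');
if (nav) {
  console.log('DNS lookup:       ', nav.domainLookupEnd - nav.domainLookupStart, 'ms');
  console.log('Time to first byte:', nav.responseStart - nav.requestStart, 'ms');
  console.log('DOMContentLoaded: ', nav.domContentLoadedEventEnd - nav.startTime, 'ms');
  console.log('Full page load:   ', nav.loadEventEnd - nav.startTime, 'ms');
}
```

This gives you millisecond-precision numbers you can record before and after an optimisation, instead of a stopwatch reading.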
But if you want to scale up or get more precise measurements, you can use Sitespeed.io.
You can just make a shell call and pass the correct parameters:
docker run --rm -v "$(pwd)":/sitespeed.io sitespeedio/sitespeed.io:9.8.1 https://www.sitespeed.io/
And it will give you plenty of information about the page:
Note that it measures page rendering time, unlike JMeter or Gatling, which are primarily service-focused tools.
You can also script more complex flows than a simple page load.