You kinda can... A tool like wget will let you download a given web page to a file, and it has options for recursing through the entire site, controlling whether it follows links that lead off-site, how deep to go, etc. A GUI front-end like
https://sourceforge.net/projects/winwget/ will help you set and remember the options, which will take some tinkering to get right.
Once you've tested it by crawling just a few pages and think you have the options right, you'll probably want to rate-limit how fast it crawls the entire site. Otherwise the site owner might notice a traffic spike, mistake you for some kind of bad bot or a DDoS attack, and block you.
The end result is that you'll have a bunch of local HTML files representing all the pages, and if the options were set right, they'll have relative links that let you click through that local snapshot as though you're browsing the live site.
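For reference, here's a sketch of the kind of wget command line those GUI options map to. The URL is a placeholder, and the exact depth and rate numbers are guesses you'd tune for the site in question:

```shell
# Mirror a site politely. The URL below is a placeholder -- substitute the real forum.
wget --recursive --level=5 --no-parent \
     --convert-links --page-requisites \
     --wait=2 --random-wait --limit-rate=200k \
     https://example.com/forum/
# --recursive / --level=5  follow links, at most 5 hops from the start page
# --no-parent              don't climb above the starting directory (stays on-site)
# --convert-links          rewrite links so the local snapshot is clickable offline
# --page-requisites        also fetch the images/CSS each page needs to render
# --wait / --random-wait   pause ~2s between requests so you don't look like an attack
# --limit-rate=200k        cap download speed at about 200 KB/s
```

The `--convert-links` flag is what gives you the relative links mentioned above; without it, the saved pages still point back at the live site.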
Yeah, sounds like too much work to me, too. Unfortunately that forum software has no option to easily dump the entire site to a friendly XML file or similar -- and his old version definitely doesn't.