Forum websites are sometimes slow to crawl due to database load (since each page request executes the same SQL queries against the database backend)
However, I recommend you first check how to *best* use resume:
http://www.microsystools.com/products/sitemap-generator/help/sitemap-generator-resume-scan/
Then stop, save your project, and use resume later.
Consider dropping extended data:
http://www.microsystools.com/products/sitemap-generator/help/creating-sitemaps-large-websites/
(this saves memory, although with "just" e.g. 150,000 URLs it *should* most often *not* be necessary)
...
Also, you may be able to cut down on the number of URLs. E.g. if you don't need member pages, you could add URL filters for both *output* and *analysis*. Doing that will speed up your crawl and save memory. (The same goes if you can avoid e.g. duplicate URLs you don't really need to *analyze* and have *output* to the sitemap.)
http://www.microsystools.com/products/sitemap-generator/help/website-crawler-output-filters/
http://www.microsystools.com/products/sitemap-generator/help/website-crawler-scanner-filters/
(remember to add the URL filters to both)
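To illustrate the difference between the two filter types, here is a generic sketch of how a crawler might apply them (this is *not* the tool's actual implementation, and all URL patterns below are hypothetical examples): an *analysis* filter stops a URL from being fetched at all, while an *output* filter only keeps it out of the generated sitemap.

```python
# Generic sketch: "analysis" filters keep URLs out of the crawl entirely,
# "output" filters only keep them out of the generated sitemap.
# All patterns and URLs are hypothetical examples.

ANALYSIS_EXCLUDE = ("memberlist.php",)          # never fetched at all
OUTPUT_EXCLUDE = ("memberlist.php", "print=1")  # fetched, but not written to sitemap

def matches_any(url, patterns):
    return any(p in url for p in patterns)

def crawl(seeds, links):
    """links: dict mapping each URL to the URLs it links to
    (a stand-in for real page fetching)."""
    seen, queue, sitemap = set(), list(seeds), []
    while queue:
        url = queue.pop(0)
        if url in seen or matches_any(url, ANALYSIS_EXCLUDE):
            continue  # analysis filter: skipped before fetching, saving time and memory
        seen.add(url)
        queue.extend(links.get(url, []))
        if not matches_any(url, OUTPUT_EXCLUDE):
            sitemap.append(url)  # output filter: crawled for links, but excluded from sitemap
    return sitemap

links = {
    "/index.php": ["/viewtopic.php?t=1", "/memberlist.php", "/viewtopic.php?t=1&print=1"],
    "/viewtopic.php?t=1": ["/index.php"],
}
print(crawl(["/index.php"], links))
```

Note why both filters matter: excluding a URL only from *output* still costs crawl time and memory, so adding the pattern to *analysis* as well is what actually speeds up the scan.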
...
When I need to handle big websites like this, which may also create multiple unwanted URLs leading to the same content etc., I first take a few test scans (e.g. 1000 URLs) and check if anything shows up that I don't want. That avoids situations where a scan takes forever and forever because of some unknown reason.
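Inspecting such a capped test scan can be as simple as grouping the found URLs by path to spot scripts that generate many variants of the same page (a generic sketch with made-up example URLs, not a feature of the tool):

```python
from collections import Counter
from urllib.parse import urlsplit

# Hypothetical URL list from a capped test scan (e.g. the first 1000 URLs found).
test_scan = [
    "/viewtopic.php?t=1", "/viewtopic.php?t=1&print=1",
    "/memberlist.php?mode=a", "/memberlist.php?mode=b",
]

# Count URLs per path: a path with many variants is a candidate for a URL filter.
by_path = Counter(urlsplit(u).path for u in test_scan)
for path, count in by_path.most_common():
    print(path, count)
```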
...
I am considering adding some more presets for common websites, e.g. WordPress, phpBB etc. Maybe I should prioritize getting those done.
