Hi,
I am running A1SG (always the latest Pro version) on a 2nd-generation i7 with 8 GB of RAM, and historically A1SG eats all available memory on larger projects, meaning large as in > 100K URLs.

(This snapshot comes from the problem happening while scanning france-voyage.com.)
I know there are settings for making querying and memory usage lighter, but since I use this (great) tool for SEO purposes I need data as close to reality as possible, especially from misconfigured or tricky servers (the ones that may give problems if I enable the suggestions in thread http://webhelpforums.net/index.php?topic=4283.0).
This memory issue has a workaround: stop the crawler, let it finish all pending requests, and save the project. Then enable the option to continue where it left off and keep going with the next batch, and so on (much "so on" if it's a large site!).
The downside of this workaround: I can't leave it working while picking up the kids from school (for example). If I forget, or rely too much on the time it's allowed to run, I may find myself with a frozen laptop and have to kill the A1SG process from Task Manager, and then all the processing time and collected data go to waste.
Is this an A1SG bug? Is it something you can fix in future versions?
I'd be happy enough with solid alternatives such as:
- Autosave settings
- Memory limit triggers (say, stop & save; save and prompt; or data batches like those ".part" compressed packs assembled at a final stage)
- Whatever you can think of

Thanks for your attention.