But of course, my dear... it's obvious.
Thanks for the fast reply. Homework assignment done.
Of course, it is not an error per se.
However, the real Question #1 is: how much extra memory is "enough" to process an external, cloud-hosted HTML file that large (53+ MB, generated by a previous but ailing YaCy server) when we use /CrawlStartExpert.html as the entry point on a new, healthy server?
Question #2: where, other than on the underlying cloud platform, is that extra memory actually applied inside the replacement (healthy) YaCy instance, where it is needed?
Somewhere in the crawler is surely the right place, and a mix of settings may well be necessary for a cloud environment.
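For what it's worth, a minimal sketch of where the heap limit typically lives in a stock YaCy install, assuming the standard `javastart_Xmx`/`javastart_Xms` keys (also adjustable from the admin Performance page). The values below are illustrative assumptions for a large-crawl scenario, not a recommendation:

```properties
# DATA/SETTINGS/yacy.conf (defaults in defaults/yacy.init)
# Maximum JVM heap — hypothetical value sized for parsing a 50+ MB start file
javastart_Xmx=Xmx2048m
# Initial JVM heap — hypothetical value
javastart_Xms=Xms512m
```

Note that this only raises the JVM heap ceiling; the cloud VM or container itself must still have enough physical memory headroom above it, or the OS may kill the process.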
I am sure someone knows the current best-practices guidance on this. ...ha ha... Many, many thanks.