I think this depends on your crawler and performance settings. I am not aware of /CrawlStartSite.html or /CrawlStartExpert.html enforcing any time restriction on submitting new crawls. So if you submit many new crawls in a short time, I guess you would eventually end up with a full crawl queue because memory limits are reached. The best thing would be to test this.
Wow, 17 minutes just to insert a crawl job? Something may have gone wrong. On my YaCy peer, running with only 600MB RAM on a 2.4GHz processor, it only takes a few seconds to add a new crawl job with the form... Do you use any options other than the defaults?
OK, it would probably be more efficient to start one crawl job with a list of start URLs instead of starting multiple crawl jobs... By the way, there is probably room for performance improvements here, so it may be valuable to create a Mantis issue. Did you notice at which number of crawl jobs it started to become unreasonably slow to insert new ones?
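To illustrate the "one job, many start URLs" idea, here is a minimal sketch that builds a single crawl-start request for YaCy's /Crawler_p.html servlet with several start URLs passed newline-separated in one field. The parameter names (`crawlingURL`, `crawlingDepth`, `crawlingMode`) mirror the CrawlStartExpert form, but please verify them against your peer's version; the URLs and depth below are just placeholders.

```python
from urllib.parse import urlencode

# Placeholder start URLs -- replace with your own list.
start_urls = [
    "https://example.org/",
    "https://example.com/",
]

# One crawl job covering all start URLs at once, instead of one
# form submission per URL. Field names assumed from the
# CrawlStartExpert form; double-check on your YaCy version.
params = {
    "crawlingstart": "Start New Crawl",
    "crawlingMode": "url",
    "crawlingURL": "\n".join(start_urls),  # newline-separated list
    "crawlingDepth": "2",
}

query = urlencode(params)
print(query)
# POST this query to http://localhost:8090/Crawler_p.html with your
# admin credentials, e.g. via urllib.request or curl.
```

This keeps the per-job overhead (queue setup, index handles) to a single crawl job rather than paying it once per URL.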