There is a long-running theory in the SEO world that Google will sometimes make up URLs in an attempt to discover, crawl and index pages it can't find otherwise. If you think about that theory, it seems inefficient and un-Google-like – but hey, that is why we have theories.
Well, John Mueller from Google pretty much said that Google does not do that. He said in a Google Webmaster Help thread, "in general, Googlebot doesn't make up URLs." In general, of course – he did say in general. But again, if you think about it, why would Google go through the effort?
"In general, Googlebot doesn't make up URLs, so if they were found (and not in your sitemap file), it's likely that they were linked from somewhere within your website (it's also possible that these links have since been removed)."
I would assume most of these cases involve weird or broken links pointing at your pages, which Google follows from other sites or from your own internal links.
Either way, do you think Google tries to crawl URLs that it makes up out of the blue?
Forum discussion at Google Webmaster Help.