You may see the following error in the crawl log even though the SharePoint Search Service is configured properly. When this happens, the crawler stops indexing the affected content.
Item not crawled due to one of the following reasons: Preventive crawl rule; Specified content source hops/depth exceeded; URL has query string parameter; Required protocol handler not found; Preventive robots directive.
To resolve this, create a robots.txt file and upload it to the root address of your site. Open an empty Notepad document, add the following content, and save it as robots.txt.
User-agent: MS Search 6.0 Robot
Disallow:
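This record names the SharePoint crawler's user agent, and the empty Disallow line means "nothing is disallowed" for that agent, so the crawler may fetch everything. As a quick sanity check before uploading, you can parse the file with Python's standard robots.txt parser and confirm it permits the crawler (the sample URL path here is just an illustration):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from above: an empty Disallow rule
# allows the named user agent to fetch any URL on the site.
robots_txt = "User-agent: MS Search 6.0 Robot\nDisallow:\n"

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The SharePoint crawler identifies itself as "MS Search 6.0 Robot".
# can_fetch reports whether a given URL is allowed for that agent.
print(parser.can_fetch("MS Search 6.0 Robot", "/Pages/Default.aspx"))  # True
```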
To upload it to the root of your site, first open the site in SharePoint Designer. Then click the "All Items" link in the left navigation pane and copy/paste the file at the same level as the document libraries.
After this, run a full crawl again; the error should no longer appear in the crawl log.