I had a similar crawl issue, and it turned out my host was blocking Googlebot through a server-level security rule. While going through my access logs, I spotted the blocked requests. Once I whitelisted Googlebot's user agent in my server config, crawling started working again. Double-check whether your server's firewall or security tools are doing something similar.
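In case it helps, here's a rough sketch of the kind of log check I did. It assumes an Nginx/Apache access log in the standard "combined" format; the log path is a placeholder, so adjust both for your setup:

```python
import re
from collections import Counter

# Placeholder path; point this at your server's access log.
LOG_PATH = "/var/log/nginx/access.log"

# Combined log format: ip - - [time] "METHOD path proto" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

status_counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group("agent"):
            status_counts[m.group("status")] += 1
            # Anything outside 2xx/3xx means Googlebot is being refused.
            if not m.group("status").startswith(("2", "3")):
                print(m.group("status"), m.group("path"))

print("Googlebot responses by status:", dict(status_counts))
```

If Googlebot's requests show up as 403s while normal browser traffic gets 200s, the block is almost certainly happening at the server or firewall level, not in Google's index.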
I had the same issue and found that my robots.txt file was blocking important folders. After I edited it to allow crawlers and cleared my cache, Google started indexing my pages again.
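A quick way to confirm your robots.txt no longer blocks Googlebot is Python's built-in robotparser. The URLs below are placeholders; swap in your own site:

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL; point this at your own robots.txt.
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

# True means robots.txt permits Googlebot to crawl that URL.
for url in ("https://example.com/", "https://example.com/blog/"):
    print(url, rp.can_fetch("Googlebot", url))
```

If this prints False for pages that should be indexed, the offending Disallow rule is still in place (or an old copy is cached).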