… it’s easily fixed.
You might have received a message like this; lots of people have:
Most websites have a robots.txt file (your developer will know about it even if you don't). It tells Google which parts of the site to crawl and which to skip: essentially a list of allowed and disallowed URLs. Googlebot reads this file and stays away from the disallowed URLs.
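For example, a minimal robots.txt might look something like this (the paths here are purely illustrative):

```
# Applies to all crawlers, including Googlebot
User-agent: *
# Keep crawlers out of this directory
Disallow: /private/
# Everything else may be crawled
Allow: /
```

The file lives at the root of the site (e.g. example.com/robots.txt), so you can view yours in a browser right now.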
So with a WordPress website, a robots.txt that looks like this will almost certainly need the problem Disallow lines removed:
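A common culprit on older WordPress installs looked something like the sketch below (exact paths vary from site to site; the comments mark which lines typically cause trouble):

```
User-agent: *
# Usually harmless: the admin area shouldn't be indexed anyway
Disallow: /wp-admin/
# Problem line: blocks CSS/JS that Googlebot needs to render your pages
Disallow: /wp-includes/
# Problem line: same issue for plugin assets
Disallow: /wp-content/plugins/
```

Blocking /wp-includes/ and plugin directories was once standard advice, but Googlebot now renders pages like a browser does, so hiding stylesheets and scripts from it makes your site look broken to Google.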
Yoast SEO, arguably the best SEO plugin for WordPress, now recommends an essentially empty robots.txt file; more on that here »
In summary, keeping Googlebot happy is just a case of removing a few lines from one file. Nothing that cannot be quickly fixed by your developer!