Robots.txt - pretty much empty after 'creation'

Started by cwallace, January 06, 2019, 11:47:04 AM


When I went to 'create robots.txt', it created a file that only has a single sitemap entry, and it refers to a 'video' one... I don't even have video on my site, so why it would take that one and not the base index I don't know...

This is what I get:

User-agent: *

Sitemap: http://<domain name>.com/sitemap-video-OSH.xml

I have 15 or more sitemap files... I would expect to see a bigger list or, as I have found with some research, a reference to a main 'index' sitemap only...

Anyway... I would like to create a nice, valid robots.txt file, but I keep getting almost nothing in it.



I also noticed several of the created sitemaps are HUGE: one has nearly 500k entries, and Google is kicking it back, telling me to break it down to 50k entries per file...

However, I am not sure whether Google can interpret those sitemaps, or whether they are even valid sitemaps for Google...

Can you point me to where I might sort out WHICH files should be uploaded and entered into the Google Search Console sitemap submitter??

Does that make sense??

I do 'create all' and it creates 35 sitemap files... I upload them and then submit them to Google.

A lot of them have errors... should I not be using them, or are these errors caused by my configuration or setup before sending?

Things like a missing XML tag (i.e. there was no URL tag because there was no content in a specific 'video' sitemap). Should I just delete it?

This is still one of the best software purchases I have ever made in regard to SEO!!

Thanks in advance...



Sorry for the delay (email support was still within 24 hours, but the forum took an unintended backseat for a while).

There are many sitemap file types you should not submit to Google, e.g. HTML sitemaps, and quite a few more that are not meant to be submitted to Google either (as they are used for entirely different purposes).

If you simply submit all generated files, you will get lots of errors and warnings about XML etc. :)

Concerning the differences between sitemap types, see this:

But if you do not have video or image content you should simply generate and upload a standard XML sitemap.
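For reference, a minimal standard XML sitemap (following the sitemaps.org protocol) looks like this; the domain and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2019-01-06</lastmod>
  </url>
</urlset>
```

Only the `<loc>` element is required per URL; `<lastmod>` and the other optional elements can be omitted.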

I suggest you do not use "Build all", but instead select XML sitemap in the dropdown of choices and click "Build selected". (And maybe clear the output directory first.) That way you avoid uploading old/wrong files to Google Webmaster Tools.

If your website is large, A1 Sitemap Generator will default to split it across multiple files including creating a sitemap index file:
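To illustrate what the split-and-index approach means in practice (this is not A1 Sitemap Generator's own code, just a sketch of what the sitemaps.org protocol requires, with placeholder domain and file names): the URL list is divided into chunks of at most 50,000 entries, one sitemap file is written per chunk, and a sitemap index file references all the parts.

```python
# Sketch of the sitemaps.org split-and-index scheme: at most 50,000 URLs
# per sitemap file, plus an index file referencing the parts.
# (Illustration only; domain and file names are placeholders.)
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50_000  # limit set by the sitemaps.org protocol
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return one <urlset> sitemap document as a string."""
    entries = "".join(f"  <url><loc>{escape(u)}</loc></url>\n" for u in urls)
    return (f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}</urlset>\n')

def build_split_sitemaps(urls, base_url="http://example.com/"):
    """Split urls into <=50k chunks; return (index_doc, [sitemap_docs])."""
    chunks = [urls[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)]
    sitemaps = [build_sitemap(chunk) for chunk in chunks]
    refs = "".join(
        f"  <sitemap><loc>{base_url}sitemap-{n}.xml</loc></sitemap>\n"
        for n in range(1, len(chunks) + 1))
    index = (f'<?xml version="1.0" encoding="UTF-8"?>\n'
             f'<sitemapindex xmlns="{SITEMAP_NS}">\n{refs}</sitemapindex>\n')
    return index, sitemaps

# Example: 120,000 URLs end up as 3 sitemap files plus one index file.
urls = [f"http://example.com/page-{i}" for i in range(120_000)]
index_doc, sitemap_docs = build_split_sitemaps(urls)
print(len(sitemap_docs))  # 3
```

With this scheme you only submit the index file to Google; it discovers the individual parts from there.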

From your robots.txt question I suspect you may have configured output paths wrongly (but I may be wrong - I will need to see the project file) - for reference here is the online help:

If I can see your project file, I will have a much better chance of helping, so please do this:

1) Make sure you are using a recent version of the software
2) Email your project file ("File | Save As..." generates a ".ini" file you can send)
3) Report examples of errors in Google Webmaster Tools if they are still there (Note: sometimes resubmitting and waiting a day helps) - the more specific the better
TechSEO360 |  | A1 Sitemap Generator, A1 Website Analyzer etc.


Hello. I have been absent a while, but have sent an e-mail in regard to this post. Hopefully that will trigger some activity again. :)

I am obviously in no rush. Still using the software as I have... just made some changes and have done some testing and benchmarking.



Hi Chris,

We emailed back and forth some days ago - in my last email I asked for further information - did you miss it by any chance?

TechSEO360 |  | A1 Sitemap Generator, A1 Website Analyzer etc.


I checked my e-mail and even spam and saw nothing. I am assuming it would come from info@micr...



I see you replied to my emails, so we will continue there :)
TechSEO360 |  | A1 Sitemap Generator, A1 Website Analyzer etc.


You can remove unwanted code from the robots.txt file. Add your sitemaps to one single robots.txt file to make it easy for crawlers.


Quote: My robots.txt has nothing in it at all except the default line: User-agent: *

Quote: I am having all sorts of situations with Google now. Saying sitemaps are invalid and unreadable...

WebHelpForums here... In posting an answer to this post (see below), I accidentally overwrote cwallace's post with the answer. Having seen this now, I have restored what I could.


Quote: My robots.txt has nothing in it at all except the default line: User-agent: *

Which should not cause problems. If you use A1 Sitemap Generator to generate robots.txt, see:
If you feel robots.txt is causing your problem, you can always create a (e.g. empty) robots.txt file yourself, or delete it :)
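If you do write it by hand, a robots.txt that allows everything and points crawlers at your sitemap (or sitemap index) can be as simple as this; the domain and file name here are placeholders:

```text
User-agent: *
Disallow:

Sitemap: http://example.com/sitemap-index.xml
```

An empty `Disallow:` directive means nothing is blocked, and the `Sitemap:` line lets crawlers discover the sitemap without it being submitted anywhere.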

Quote: I am having all sorts of situations with Google now. Saying sitemaps are invalid and unreadable...

From the description it could be anything like:

  • Server causing issues (MIME/similar, server HTTP responses etc.)
  • Wrong files submitted (e.g. if you assigned HTML sitemap output to be the same as XML sitemap file)
  • You submitted files that do not follow the XML sitemaps protocol
  • Submitting XML sitemaps containing URLs for a different domain
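As a quick local sanity check before resubmitting, you can verify that a sitemap is well-formed XML, uses the sitemaps.org namespace, and stays under the 50,000-URL limit. This is just a sketch using Python's standard library, not part of the software; the sample document is a placeholder:

```python
# Minimal local sanity check for an XML sitemap: well-formed XML,
# expected sitemaps.org root element, and at most 50,000 <url> entries.
# (Illustrative sketch only; feed it the contents of your own file.)
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(xml_text):
    """Return a list of problems found (empty list means it looks OK)."""
    problems = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return [f"not well-formed XML: {exc}"]
    if root.tag not in (SITEMAP_NS + "urlset", SITEMAP_NS + "sitemapindex"):
        problems.append(f"unexpected root element: {root.tag}")
    urls = root.findall(SITEMAP_NS + "url")
    if root.tag == SITEMAP_NS + "urlset" and not urls:
        problems.append("urlset contains no <url> entries")
    if len(urls) > 50_000:
        problems.append(f"too many URLs: {len(urls)} (max 50,000)")
    return problems

sample = ('<?xml version="1.0" encoding="UTF-8"?>'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
          '<url><loc>http://example.com/</loc></url></urlset>')
print(check_sitemap(sample))  # []
```

A check like this catches the "empty video sitemap with no URL tag" case mentioned earlier in the thread, before Google ever sees the file.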

Let us locate the problem!

  • Does the problem persist if you regenerate/upload/submit again? When was it last checked?
  • What is the URL of your standard XML sitemap file? (if you prefer email that is completely fine as well)
  • What is the exact Google error for your standard XML sitemap file?
  • What version of A1 Sitemap Generator are you using?
TechSEO360 |  | A1 Sitemap Generator, A1 Website Analyzer etc.
