django-robots: sitemap not being included in robots.txt

I have tried both default discovery and specifying the sitemap manually via the ROBOTS_SITEMAP_URLS setting, but I am still unable to get the sitemap to show up in robots.txt.
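
For reference, the manual route in settings.py would look something like this (a minimal sketch; the domain and path are placeholders, not taken from the original report):

# settings.py -- hypothetical values, adjust to your site
ROBOTS_SITEMAP_URLS = [
    'https://www.example.com/sitemap.xml',
]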

About this issue

  • State: closed
  • Created 11 years ago
  • Comments: 30 (12 by maintainers)

Most upvoted comments

@SalahAdDin 3.0rc1 has been released on PyPI. I’m going to check that everything is OK and I will release 3.0 final in a couple of days (sooner if you can confirm rc1 is fine! 😉)

@mheppner 🤦‍♂️

@myusuf3 For the record, that type of snark is counter-productive, and I think you should know better. Please be so kind as to stop giving this type of feedback.

4 years guys.

I know this is a pretty old issue, but I wanted to clarify that reverse() does not pick up the sitemap if it’s behind cache_page() in your URL conf; it might be helpful to update the documentation for others. In my case, I was using a main index (django.contrib.sitemaps.views.index) which linked off to the section views (django.contrib.sitemaps.views.sitemap), so I just took the caching off the index and django-robots could then reverse to it. The sections can still be cached, as in the sketch below.
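
To illustrate, a URL conf along those lines might look like the following (a sketch using modern path() syntax; the `myapp.sitemaps` module, the `sitemaps` dict, and the URL names are assumptions):

# urls.py -- index left uncached, sections cached
from django.contrib.sitemaps import views as sitemaps_views
from django.urls import path
from django.views.decorators.cache import cache_page

from myapp.sitemaps import sitemaps  # hypothetical dict of Sitemap classes

urlpatterns = [
    # Leave the index uncached so django-robots can reverse() it.
    path('sitemap.xml', sitemaps_views.index,
         {'sitemaps': sitemaps, 'sitemap_url_name': 'sitemaps'},
         name='sitemap-index'),
    # Section views can still be wrapped in cache_page().
    path('sitemap-<section>.xml', cache_page(10 * 60)(sitemaps_views.sitemap),
         {'sitemaps': sitemaps}, name='sitemaps'),
]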

Alternatively, if you have just one sitemap, it seems to work fine if you create an alias view and don’t apply cache_page() directly in your URL conf:

# views.py
from django.contrib.sitemaps import views as sitemaps_views
from django.views.decorators.cache import cache_page

# Apply the cache decorator here rather than inline in the URL conf,
# so the named URL stays resolvable via reverse().
@cache_page(10 * 60)  # cache for 10 minutes
def sitemap_index(*args, **kwargs):
    # Forward everything (including the `sitemaps` kwarg) to the stock view.
    return sitemaps_views.index(*args, **kwargs)
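
And a hypothetical URL conf entry wiring up that alias view (module paths and names are assumptions):

# urls.py -- hypothetical wiring for the alias view above
from django.urls import path

from myapp.sitemaps import sitemaps  # assumed dict of Sitemap classes
from myapp.views import sitemap_index

urlpatterns = [
    # django-robots can reverse() this name because cache_page() is not
    # applied inline here.
    path('sitemap.xml', sitemap_index, {'sitemaps': sitemaps},
         name='sitemap-index'),
    # ... section URL(s) wired as usual ...
]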

Same here. Doesn’t resolve or appear in “robots.txt”.