
Remove custom robots.txt and sitemap #4532

Open · wants to merge 1 commit into main
Conversation

@DimedS (Member) commented Feb 28, 2025

Description

Related to #3741.

We decided to use the default robots.txt and sitemap.xml from Read the Docs, because we embedded the noindex tag in all old docs versions, preventing search engines from indexing them.

See: #4516

Developer Certificate of Origin

We need all contributions to comply with the Developer Certificate of Origin (DCO). All commits must be signed off by including a Signed-off-by line in the commit message. See our wiki for guidance.

If your PR is blocked due to unsigned commits, then you must follow the instructions under "Rebase the branch" on the GitHub Checks page for your PR. This will retroactively add the sign-off to all unsigned commits and allow the DCO check to pass.

Checklist

  • Read the contributing guidelines
  • Signed off each commit with a Developer Certificate of Origin (DCO)
  • Opened this PR as a 'Draft Pull Request' if it is work-in-progress
  • Updated the documentation to reflect the code changes
  • Added a description of this change in the RELEASE.md file
  • Added tests to cover my changes
  • Checked if this change will affect Kedro-Viz, and if so, communicated that with the Viz team

@astrojuanlu (Member)
Thanks @DimedS ! Should we hold this off a few days to give Google some time to pick up the changes of #4516 ?

@astrojuanlu left a comment

Nah scratch that. Let's proceed with this. Old pages cannot pick up the noindex tag precisely because they're blocked by our custom robots.txt 👍🏼
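The interplay astrojuanlu describes can be sketched with Python's standard `urllib.robotparser`: a page disallowed by robots.txt is never fetched by the crawler, so a `noindex` meta tag inside its HTML is never seen. The domain and version path below are hypothetical, not Kedro's actual URLs.

```python
from urllib import robotparser

# Hypothetical rules resembling a custom robots.txt that blocks
# an old docs version from being crawled.
rules = [
    "User-agent: *",
    "Disallow: /en/0.18.0/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A blocked page: the crawler never fetches it, so any
# <meta name="robots" content="noindex"> in its HTML goes unseen.
blocked = rp.can_fetch("*", "https://docs.example.org/en/0.18.0/index.html")
print(blocked)   # False: disallowed, noindex cannot be picked up

# An unblocked page can be fetched, so its noindex tag would be honoured.
allowed = rp.can_fetch("*", "https://docs.example.org/en/latest/index.html")
print(allowed)   # True
```

This is why removing the custom robots.txt is a prerequisite for the noindex tags in old versions to take effect.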

@DimedS (Member, Author) commented Feb 28, 2025

> Thanks @DimedS ! Should we hold this off a few days to give Google some time to pick up the changes of #4516 ?

Thanks, @astrojuanlu ! I also want to highlight that nothing will change after #4516, because the script we incorporated locally in our docs folder in that PR is not activated. I think we can ask RTD to activate it only after the release. Currently, the same script is still running from Google Cloud. Since it has been running for over a week, I believe Google has already picked up all the changes from it.
