
Tests fail with Python 3.12 #263

Closed
gvanrossum opened this issue Feb 5, 2023 · 9 comments

Comments

@gvanrossum
Member

Whenever I push something to a PR in the cpython main branch (3.12) I get an email from this repo telling me that the 3.12 run failed. Is there a way we can fix it? Here's a link to the failing run: https://github.com/python/pyperformance/actions/runs/4094033809/jobs/7059879010

I'd rather not just disable running with 3.12 (that would presumably take away some useful signal) but maybe we can fix or suppress the failing tests?

@AlexWaygood
Member

I believe this PR by @hugovk would mean that the tests would still run, but wouldn't be reported as "failures" for now.

@gvanrossum
Member Author

Yeah, that just disables 3.12 altogether, right? Isn't there a way to only disable those tests that depend on greenlet?

@gvanrossum
Member Author

IIRC in the past we resolved a similar thing (for a different dependency) by changing the requirements.txt file to reference a specific PR (via some notation I've forgotten that lets you depend on a git repo/branch).
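(For context, the notation referred to here is most likely pip's "direct reference" syntax, which lets a requirements.txt entry point at a git repository and branch or commit. A minimal sketch follows; the repository URL and branch name are placeholders, not the actual change that was made:)

```
# hypothetical requirements.txt entry pinning greenlet to a git branch
# (repository URL and branch name are placeholders)
greenlet @ git+https://github.com/python-greenlet/greenlet@master
```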

@hugovk
Member

hugovk commented Feb 5, 2023

Yeah, that just disables 3.12 altogether, right?

Yes, the original intent of #259 was to run on 3.12 and, if 3.12 failed, not to fail the whole build. But I couldn't find the right continue-on-error incantation, so instead it disables the whole 3.12 job.

Isn't there a way to only disable those tests that depend on greenlet?

Yes, we can skip them when greenlet isn't available. Please see PR #264.
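(A minimal sketch of that kind of skip, assuming unittest-style tests; the class and test names are hypothetical and the actual change is in PR #264:)

```python
import unittest

# Detect whether greenlet can be imported at all; it may not build or
# install on a pre-release Python such as 3.12 at the time of this issue.
try:
    import greenlet  # noqa: F401
    HAVE_GREENLET = True
except ImportError:
    HAVE_GREENLET = False


@unittest.skipUnless(HAVE_GREENLET, "greenlet is not installed")
class GreenletBenchmarkTests(unittest.TestCase):
    def test_greenlet_benchmark_runs(self):
        # ... exercise the greenlet-based benchmark here ...
        pass
```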

@gvanrossum
Member Author

Yeah, #264 feels better. I hesitate to approve it, since I am not familiar with this code base (nor with the subtleties of GitHub workflows), but maybe @mdboom can take a quick look.

@corona10
Member

corona10 commented Apr 19, 2023

@gvanrossum I proposed another PR that does not touch the pyperformance code directly, but instead handles this through GitHub Actions metadata.

PR: #274 (expected to work correctly once the PR is merged)
Sample: corona10#1 (comment)

@corona10
Member

corona10 commented Apr 19, 2023

Ah, https://github.com/python/pyperformance/pull/259/files#r1096738419 looks like a similar approach, but I didn't notice it.

@hugovk
Member

hugovk commented Apr 19, 2023

Looks like the key difference between those is that you have continue-on-error: ${{ matrix.experimental }} at the job level, whereas I had it at the step level. Nice work!
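(A minimal sketch of the job-level pattern being described; the job name, Python versions, and steps are placeholders rather than the exact workflow from #274:)

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    # Job-level continue-on-error: an "experimental" matrix entry may fail
    # without marking the whole workflow run as failed.
    continue-on-error: ${{ matrix.experimental }}
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.10", "3.11"]
        experimental: [false]
        include:
          - python-version: "3.12-dev"
            experimental: true
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Run tests (placeholder command)
        run: python -m pip install -e . && python -m unittest discover
```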

corona10 added a commit that referenced this issue Apr 19, 2023
corona10 added a commit to corona10/pyperformance that referenced this issue Apr 20, 2023
@corona10
Member

Closing this issue :) If it needs to be reopened, feel free to open it again.
