Add compatibility for aliyun LLM APIs #530


Merged
2 commits merged into openai-php:main from the fix-aliyun-compatibility branch on Apr 13, 2025

Conversation

@ThorbJ (Contributor) commented on Mar 5, 2025

Summary

Some Aliyun API responses do not include the status_details field, which causes an error when trying to access it. This PR ensures compatibility by setting status_details to null when it is missing.

Changes

  • Checks whether status_details exists in the response before accessing it.
  • Defaults status_details to null if it is not present (see the sketch after this list).
  • Ensures compatibility with both the OpenAI and Aliyun APIs.
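
A minimal sketch of the guard (hypothetical class and property names; the actual change lives wherever openai-php/client hydrates the raw API payload into a response object). PHP's null-coalescing operator covers the missing-key case:

```php
<?php

// Illustrative only: a response object whose status_details may be absent
// from the raw payload, as with some Aliyun responses.
final class FileResponse
{
    public function __construct(
        public readonly string $status,
        public readonly ?string $statusDetails, // nullable by design
    ) {
    }

    /**
     * @param  array{status: string, status_details?: ?string}  $attributes
     */
    public static function from(array $attributes): self
    {
        return new self(
            status: $attributes['status'],
            // Null-coalescing keeps the OpenAI path unchanged and tolerates
            // Aliyun payloads that omit the key entirely.
            statusDetails: $attributes['status_details'] ?? null,
        );
    }
}
```

With this pattern, `FileResponse::from(['status' => 'processed'])` yields a null statusDetails instead of erroring on the missing key.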

Why this is needed

Without this fix, Aliyun API responses that lack status_details will cause runtime errors, breaking the client integration.

Testing

  • Tested with OpenAI API (unchanged behavior).
  • Tested with Aliyun API (no more errors when status_details is missing).

Backward Compatibility

✅ Fully backward compatible. If status_details is present, the behavior remains unchanged.

@iBotPeaches (Collaborator)

Apologies. I can't tell if this passed CI or not since it decayed out. I believe this is fine to merge since the existing types have it as nullable, but I can't confirm till the CI passes.

If you rebase I should be able to confirm.

@iBotPeaches iBotPeaches marked this pull request as draft on April 8, 2025, 22:33
@iBotPeaches iBotPeaches added the "Non-OpenAI Model" label on April 10, 2025
@ThorbJ ThorbJ force-pushed the fix-aliyun-compatibility branch from 8f6d4ea to dd01377 on April 13, 2025, 10:07
@ThorbJ ThorbJ marked this pull request as ready for review on April 13, 2025, 10:24
@ThorbJ (Contributor, Author) commented on Apr 13, 2025

> Apologies. I can't tell if this passed CI or not since it decayed out. I believe this is fine to merge since the existing types have it as nullable, but I can't confirm till the CI passes.
>
> If you rebase I should be able to confirm.

Hi, I've rebased the branch and marked the PR as ready for review. The workflows are currently awaiting approval since this PR comes from a fork. Could you please approve and run the workflows? Thanks!

@iBotPeaches (Collaborator)

Looks good. I'm on mobile right now, but I just want to confirm we have a test with this property removed, to prevent regressions. It might already exist, but I didn't check.

@iBotPeaches (Collaborator)

I added a test - just waiting on CI to confirm.
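
For illustration, a regression test in that spirit might look like the following (hypothetical, Pest-style to match the repository's test suite; FileResponse is the illustrative class from the sketch above):

```php
<?php

// Hypothetical regression test: the payload deliberately omits
// status_details, mimicking an Aliyun response.
it('hydrates a response whose payload lacks status_details', function () {
    $response = FileResponse::from([
        'status' => 'processed',
        // 'status_details' intentionally absent
    ]);

    expect($response->statusDetails)->toBeNull();
});
```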

@iBotPeaches (Collaborator) left a comment


Since status_details could be null, I feel safe merging this to hit parity with WebUI/Aliyun. We're just improving our handling to reach the null outcome.

@iBotPeaches iBotPeaches merged commit bb305b0 into openai-php:main Apr 13, 2025
10 checks passed
@iBotPeaches (Collaborator)

thanks!
