response_schema does not work with generate_content_stream #14

@hjlarry

Description


Thanks for stopping by to let us know something could be better!

PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.

Is this a client library issue or a product issue? We will only be able to assist with issues that pertain to the behaviors of this library. If the issue you're experiencing is due to the behavior of the product itself, please visit the Support page to reach the most relevant engineers.

If the support paths suggested above still do not result in a resolution, please provide the following details.

Environment details

  • Programming language: Python
  • OS: Windows (local development environment)
  • Language runtime version: Python 3.12
  • Package version: google-genai 0.2.2

Steps to reproduce

I followed the JSON schema tutorial, but using stream mode raises an error.

This is my code:

from google import genai

# Client setup assumed; the API key is read from the environment.
client = genai.Client()

response = client.models.generate_content_stream(
    model='gemini-2.0-flash-exp',
    contents='Give me information of the United States.',
    config={
        'response_mime_type': 'application/json',
        'response_schema': {
            'properties': {
                'name': {'type': 'STRING'},
                'population': {'type': 'INTEGER'},
                'capital': {'type': 'STRING'},
                'continent': {'type': 'STRING'},
                'gdp': {'type': 'INTEGER'},
                'official_language': {'type': 'STRING'},
                'total_area_sq_mi': {'type': 'INTEGER'},
            },
            'type': 'OBJECT',
        },
    },
)

for r in response:
    parts = r.candidates[0].content.parts
    for p in parts:
        print(p.text)

This is the error:

Traceback (most recent call last):
  File "C:\Users\hejl\PycharmProjects\dify\api\.venv\Lib\site-packages\google\genai\models.py", line 3723, in generate_content_stream
    return_value = types.GenerateContentResponse._from_response(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\hejl\PycharmProjects\dify\api\.venv\Lib\site-packages\google\genai\types.py", line 2464, in _from_response
    result.parsed = json.loads(result.text)
                    ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\hejl\.pyenv\pyenv-win\versions\3.12.0\Lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\hejl\.pyenv\pyenv-win\versions\3.12.0\Lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\hejl\.pyenv\pyenv-win\versions\3.12.0\Lib\json\decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
               ^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)

If I make the same request with client.models.generate_content it works, but I need streaming. Thanks!
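For reference, a minimal sketch of the non-streaming variant that works for me, assuming the same client setup, model, prompt, and schema as in the snippet above:

import json

response = client.models.generate_content(
    model='gemini-2.0-flash-exp',
    contents='Give me information of the United States.',
    config={
        'response_mime_type': 'application/json',
        'response_schema': {
            'properties': {
                'name': {'type': 'STRING'},
                'population': {'type': 'INTEGER'},
                'capital': {'type': 'STRING'},
                'continent': {'type': 'STRING'},
                'gdp': {'type': 'INTEGER'},
                'official_language': {'type': 'STRING'},
                'total_area_sq_mi': {'type': 'INTEGER'},
            },
            'type': 'OBJECT',
        },
    },
)

# The complete (non-streamed) response text parses as JSON without errors.
print(json.loads(response.text))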


Metadata

Labels

priority: p1 - Important issue which blocks shipping the next release. Will be fixed prior to next release.
type: bug - Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
