Hello, how hard is it to add support for a new LLM API? A quick explainer of the required methods that an adapter class needs to expose would be really nice.

For reference, I want to use a transformer built with the x-transformers library: https://github.com/lucidrains/x-transformers

Generation with this library looks like this:

import torch

from x_transformers import (
    Decoder,
    TransformerWrapper,
    AutoregressiveWrapper
)

model = TransformerWrapper(
    num_tokens = 4,
    numerical_token_id = 3,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 12,
        heads = 8
    )
)


model = AutoregressiveWrapper(model)

# mock data
ids = torch.randint(0, 4, (1, 777))

# train on a lot of data above

loss = model(ids)
loss.backward()

# then generate
start_ids = torch.randint(0, 4, (1, 1))

ids_out, num_out, is_number_mask = model.generate(start_ids, 17)

The precise way the model is sampled is visible in the AutoregressiveWrapper's generate function at https://github.com/lucidrains/x-transformers/blob/main/x_transformers/autoregressive_wrapper.py
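For anyone unfamiliar with that pattern, the core of such a generate function is a plain autoregressive loop: score the current sequence, pick the next token from the logits for the last position, append it, and repeat. Here is a minimal pure-Python sketch of that shape; the `logits_fn` stub and the function name are illustrative assumptions, not x-transformers' actual code, which additionally handles caching, logit filtering, and the numerical-token bookkeeping.

```python
import math
import random

def sample_autoregressive(logits_fn, prompt_ids, num_new_tokens, temperature=1.0):
    """Minimal sampling loop with the same shape as an
    AutoregressiveWrapper.generate-style function: score the sequence,
    sample the next token, append, repeat."""
    ids = list(prompt_ids)
    for _ in range(num_new_tokens):
        logits = logits_fn(ids)  # one score per vocabulary entry
        # temperature sampling via softmax weights
        weights = [math.exp(l / temperature) for l in logits]
        next_id = random.choices(range(len(logits)), weights=weights)[0]
        ids.append(next_id)
    return ids

# toy stand-in "model" over a 4-token vocabulary that strongly prefers token 2
toy_logits = lambda ids: [0.0, 0.0, 5.0, 0.0]
generated = sample_autoregressive(toy_logits, [1], num_new_tokens=17)
```

The call mirrors the snippet above: a 1-token prompt plus 17 generated tokens.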

I suspect that it should be straightforward to create an "OutlinesAutoregressiveWrapper" class that provides the right methods, if there is documentation on how to do this. Does anyone have an idea of how to accomplish this?

You should check out the v1.0 branch and the models module there. This will become the default version in a couple of weeks. @RobinPicard should be able to give more pointers.


Do you mean the base classes for a model type adapter? I don't see anything specific to x_transformers yet.

https://github.com/dottxt-ai/outlines/blob/v1.0/outlines/models/base.py

@rlouf:
Yes, that's what I meant. If x_transformers follows the transformers API exactly, you should be able to rely on the Transformers class and only implement a from_xtransformers function in a new x_transformers.py module.


Hi @swamidass! We do not support this library yet. If you want to try creating a model for it, you should subclass Model and ModelTypeAdapter from the outlines/models/base.py file. If the library is compatible with Outlines, it should be rather straightforward as you can draw inspiration from other models.

If it works out and this library is used by other people, we could then consider adding it to Outlines.
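To make the suggested shape concrete, here is a rough pure-Python sketch of what such a subclass pair could look like. The base classes below are simplified stand-ins for the real ones in outlines/models/base.py, and the method names (`generate`, `format_input`) and signatures are illustrative assumptions, not the library's actual interface.

```python
# Simplified stand-ins for outlines' Model / ModelTypeAdapter base classes
# (see outlines/models/base.py for the real interfaces).

class ModelTypeAdapter:
    def format_input(self, model_input):  # hypothetical method name
        raise NotImplementedError

class Model:
    def generate(self, model_input, output_type=None):  # hypothetical signature
        raise NotImplementedError

class XTransformersTypeAdapter(ModelTypeAdapter):
    """Turns incoming inputs into the token-id lists the wrapper expects."""
    def format_input(self, model_input):
        # assumption: inputs arrive as already-tokenized id sequences
        return list(model_input)

class XTransformers(Model):
    """Wraps an x-transformers AutoregressiveWrapper-style model."""
    def __init__(self, wrapped_model, max_new_tokens=17):
        self.model = wrapped_model
        self.type_adapter = XTransformersTypeAdapter()
        self.max_new_tokens = max_new_tokens

    def generate(self, model_input, output_type=None):
        ids = self.type_adapter.format_input(model_input)
        # the real class would call self.model.generate(start_ids, ...) here
        return self.model(ids, self.max_new_tokens)

# toy stand-in for the wrapped model: echoes the prompt and pads with zeros
toy_wrapper = lambda ids, n: ids + [0] * n
model = XTransformers(toy_wrapper)
```

The split of responsibilities is the point of the sketch: the type adapter handles input/output formatting, while the model class owns the call into the underlying library.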
