llm.go

A GPT-2 implementation written in Go, using only the standard library.

🪞 Quick start

Install the Python dependencies and generate the tokenized dataset:

make setup

Run the training script:

make train

This runs go run ./cmd/traingpt2/main.go

Run the testing script:

make test

This runs go run ./cmd/testgpt2/main.go
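
If you prefer not to go through make, the same two entry points named by those targets can be run directly:

go run ./cmd/traingpt2/main.go
go run ./cmd/testgpt2/main.go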

TODO

  • Tokenize input text: the current implementation is incorrect and needs to do BPE pair matching rather than trie lookups (see the sketch after this list).
  • Very slow; performance needs improvement.
  • It runs in WASM, but adding WebGPU bindings might be fun.
  • More refactoring.
  • Run as a CLI.
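
The pair-matching item above refers to the merge step of byte-pair encoding. Below is a minimal, illustrative sketch of that step in Go; the names (mergeRanks, bpe) and the toy merge table are assumptions for illustration, not this repository's API. A real GPT-2 tokenizer loads its merge ranks from the released merges file and wraps this loop in byte-level pre-tokenization and a vocabulary lookup.

```go
package main

import (
	"fmt"
	"math"
)

// mergeRanks maps an adjacent symbol pair to its merge priority
// (lower rank = merged earlier). Toy values for illustration only.
var mergeRanks = map[[2]string]int{
	{"l", "l"}:    0,
	{"h", "e"}:    1,
	{"he", "ll"}:  2,
	{"hell", "o"}: 3,
}

// bpe repeatedly merges the adjacent pair with the best (lowest) rank
// until no pair in the sequence appears in the merge table.
func bpe(symbols []string) []string {
	for {
		best, bestRank := -1, math.MaxInt
		for i := 0; i+1 < len(symbols); i++ {
			if r, ok := mergeRanks[[2]string{symbols[i], symbols[i+1]}]; ok && r < bestRank {
				best, bestRank = i, r
			}
		}
		if best < 0 {
			return symbols // no applicable merges left
		}
		// Replace the pair at best, best+1 with the merged symbol.
		symbols[best] += symbols[best+1]
		symbols = append(symbols[:best+1], symbols[best+2:]...)
	}
}

func main() {
	fmt.Println(bpe([]string{"h", "e", "l", "l", "o"})) // [hello] with the toy ranks above
}
```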

🖋️ License

See LICENSE for more details.

🎉 Acknowledgements

  • This is a fork of Andrej Karpathy's llm.c, rewritten in pure Go.
