This repo contains code that uses the colabxterm and LangChain community packages to install Ollama on the Google Colab free tier (T4 GPU), pull a model from Ollama, and chat with it.


Ollama integration with Google Colab

  1. In your Google Colab notebook, install the necessary Python packages.
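
  For example, in a Colab code cell (a minimal sketch; the pip package names colab-xterm and langchain-community are assumed to provide the %xterm magic and the Ollama wrapper, respectively):

    !pip install colab-xterm langchain-community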

  2. Load the xterm extension to use a terminal within a Colab notebook.
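
  For example, in a Colab code cell (assuming the extension installed above is registered under the name colabxterm):

    %load_ext colabxterm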

  3. Open the terminal inside the Colab cell by running:

    %xterm

  4. Inside the xterm terminal (opened within the Colab cell):

  • Type the following command to install Ollama:

    curl -fsSL https://ollama.com/install.sh | sh

  • Type the following command to start the Ollama server and run the llama3 model:

    ollama serve & ollama run llama3

  5. After leaving the xterm terminal, import the Ollama class from the LangChain community library (see the sketch below).

  6. Use the llm.invoke() method to prompt the model and receive its response.
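
  A minimal sketch of steps 5 and 6, assuming the Ollama wrapper lives at langchain_community.llms and that the server started in step 4 is listening on its default local port:

    from langchain_community.llms import Ollama

    # Connect to the Ollama server started inside the xterm terminal
    llm = Ollama(model="llama3")

    # Prompt the model and print its response
    response = llm.invoke("Explain what Ollama is in one sentence.")
    print(response)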

Reference: https://www.youtube.com/watch?v=LN9rlGNaXUA&ab_channel=AkashDawari
