AmpleAI Plugin

AmpleAI Plugin is an Amplenote plugin that adds OpenAI & Ollama interactivity to Amplenote.

Recent History

Installation

  1. Clone this repo: git clone git@github.com:alloy-org/openai-plugin.git
  2. Install node and npm if you haven't already.
  3. Run npm install to install the packages.
  4. Copy .env.example to .env and fill in the environment variable for your OpenAI API key.

Testing

To run a specific test file, use: NODE_OPTIONS=--experimental-vm-modules npm test -- test/plugin.test.js
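The NODE_OPTIONS=--experimental-vm-modules flag enables Jest's native ES module support. A plausible minimal package.json wiring under that assumption (illustrative only; the actual repo's file may differ):

```json
{
  "type": "module",
  "scripts": {
    "test": "jest"
  }
}
```

With this layout, the flag must be passed via NODE_OPTIONS (as in the commands below) rather than baked into the script, because it configures the Node process that runs Jest.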

For Cursor and other LLM agents to invoke a test file, the following must be run OUTSIDE sandbox mode; otherwise, the tests will fail with "Error: Cannot find module '@jest/test-sequencer'":

cd /Users/bill/src/ai-plugin && NODE_OPTIONS=--experimental-vm-modules npm test -- test/path-to-file.test.js

And to invoke a test file while only running the test whose name includes "llama":

cd /Users/bill/src/ai-plugin && NODE_OPTIONS=--experimental-vm-modules npm test -- test/path-to-file.test.js -t "llama"
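Jest's -t flag keeps only the tests whose full name matches the given pattern (treated as a regex, so a plain substring like "llama" works). A small sketch of that selection logic (the test names here are illustrative, not from the repo):

```javascript
// Illustrative only: how Jest's -t / --testNamePattern filters tests.
// The pattern is compiled to a regex and matched against each test's
// full name; unmatched tests are skipped.
const testNames = [
  "should allow appOption freeform Q&A",
  "should summarize with llama",
  "should answer with openai",
];
const pattern = "llama";
const selected = testNames.filter(name => new RegExp(pattern).test(name));
console.log(selected); // [ 'should summarize with llama' ]
```

Because it is a substring match, "-t llama" would also pick up a test named "should run llama-2", so choose a pattern specific enough for the test you want.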

Or, to run all the tests: NODE_OPTIONS=--experimental-vm-modules npm test

Testing a single test

npm test -- -t "should allow appOption freeform Q&A" test/plugin.test.js

Skipping Local LLM Tests

If you don't have Ollama running locally, you can skip the local LLM tests (which test Mistral and other Ollama models) by setting the LOCAL_MODELS environment variable to suspended:

LOCAL_MODELS=suspended NODE_OPTIONS=--experimental-vm-modules npm test
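Inside a test file, this kind of switch is typically implemented by checking the environment variable and swapping in describe.skip. A hedged sketch of the assumed convention (function name is hypothetical, not from the repo):

```javascript
// Hypothetical helper: decide whether local-LLM (Ollama) test suites
// should run, based on the LOCAL_MODELS environment variable the
// README describes. "suspended" disables them; anything else runs them.
function shouldRunLocalModelTests(env) {
  return env.LOCAL_MODELS !== "suspended";
}

// In a Jest file, a suite could then be gated like:
//   const describeLocal = shouldRunLocalModelTests(process.env) ? describe : describe.skip;
//   describeLocal("mistral via Ollama", () => { /* ... */ });
console.log(shouldRunLocalModelTests({ LOCAL_MODELS: "suspended" })); // false
console.log(shouldRunLocalModelTests({})); // true
```

Using describe.skip keeps the suppressed suites visible in the test report as skipped, rather than silently absent.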

This will prevent test failures from local model tests when Ollama is not running.

Testing with JetBrains

https://public.amplenote.com/F4rghypGZSXEjjFLiXQTxxcR

Run tests continuously while modifying the plugin

NODE_OPTIONS=--experimental-vm-modules npm run test -- --watch

Technologies used to help with this project

Run Ollama

OLLAMA_ORIGINS=https://plugins.amplenote.com ollama serve
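OLLAMA_ORIGINS controls Ollama's CORS allowlist: browser requests from the Amplenote plugin host carry an Origin header, and Ollama only accepts them if that origin is listed. A simplified sketch of the check (this is an assumption about the behavior, not Ollama's actual source):

```javascript
// Simplified model of a CORS origin check like the one OLLAMA_ORIGINS
// configures: the allowlist is a comma-separated string of origins,
// and a request is accepted only if its Origin appears in that list.
function originAllowed(allowedOrigins, requestOrigin) {
  return allowedOrigins
    .split(",")
    .map(origin => origin.trim())
    .includes(requestOrigin);
}

console.log(originAllowed("https://plugins.amplenote.com", "https://plugins.amplenote.com")); // true
console.log(originAllowed("https://plugins.amplenote.com", "https://evil.example")); // false
```

Without the OLLAMA_ORIGINS setting above, the plugin's in-browser requests to a local Ollama server would be rejected by CORS even though the server is reachable.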
