> This repository was archived by the owner on Jun 27, 2024. It is now read-only.

# perplexity-cli

Chat with Perplexity AI in the terminal.

## Table of Contents

- [Dependency](#dependency)
- [Installation](#installation)
- [How to use](#how-to-use)
- [Want an alternative?](#want-an-alternative)
- [Note](#note)

## Dependency

- Node.js (the installation commands below use `npm` and `npx`)

## Installation

```bash
npm i playwright-chromium playwright-extra puppeteer-extra-plugin-stealth
npx playwright install chromium
```

## How to use

- `perplexity.js` fetches the result from https://www.perplexity.ai/:

  ```bash
  $ ./perplexity.js "enter any text here"
  ```

- `perplexity-labs.js` fetches the result from https://labs.perplexity.ai/:

  ```bash
  $ ./perplexity-labs.js "enter any text here"
  ```

The LLM `sonar-medium-online` is selected by default. Pass a model name as the second argument to select another model, for example: `./perplexity-labs.js "text" "codellama-70b-instruct"`.
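If you call the script often, the two positional arguments can be wrapped in a small shell helper. The sketch below only illustrates the argument order described above; the `ask` function name and the fallback logic are assumptions, not part of this repository.

```bash
# Minimal sketch (assumed helper, not shipped with this repo):
# wraps perplexity-labs.js so the model becomes an optional second
# argument, falling back to the default sonar-medium-online.
ask() {
  local question="$1"
  local model="${2:-sonar-medium-online}"   # default model per the README
  ./perplexity-labs.js "$question" "$model"
}

# Usage:
#   ask "enter any text here"
#   ask "enter any text here" "codellama-70b-instruct"
```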

## Want an alternative?

Check out copilot-cli

## Note

This script handles only one question and one answer at a time. The response is returned as plain text, which makes it well suited for getting quick answers in the terminal. It is not designed for a polished conversational experience, but for efficient command-line use of Perplexity AI.
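Because the answer comes back as plain text (assuming the script writes it to stdout, which the README does not state explicitly), it composes naturally with ordinary shell tools:

```bash
# Save an answer for later reference (file name is arbitrary)
./perplexity.js "enter any text here" > answer.txt

# Page through a longer answer
./perplexity.js "enter any text here" | less
```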


Buy Me A Coffee
