Run style check
joone committed Mar 18, 2024
1 parent 1f58138 commit f9c1410
Showing 13 changed files with 75 additions and 47 deletions.
1 change: 0 additions & 1 deletion .github/FUNDING.yml
@@ -1,2 +1 @@
github: [joone]

25 changes: 12 additions & 13 deletions .github/workflows/node.js.yml
@@ -5,13 +5,12 @@ name: Node.js CI

on:
push:
branches: [ "main", "cicd"]
branches: ["main", "cicd"]
pull_request:
branches: [ "main" ]
branches: ["main"]

jobs:
build:

runs-on: ubuntu-latest

strategy:
@@ -20,13 +19,13 @@ jobs:
# See supported Node.js release schedule at https://nodejs.org/en/about/releases/

steps:
- uses: actions/checkout@v3
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
cache: 'npm'
- run: sudo apt-get update && sudo apt-get install -y git
- run: npm ci
- run: npm run build --if-present
- run: OPENAI_API_KEY=${{ secrets.OPENAI_API_KEY }} npm test
- uses: actions/checkout@v3
- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
with:
node-version: ${{ matrix.node-version }}
cache: "npm"
- run: sudo apt-get update && sudo apt-get install -y git
- run: npm ci
- run: npm run build --if-present
- run: OPENAI_API_KEY=${{ secrets.OPENAI_API_KEY }} npm test
12 changes: 10 additions & 2 deletions CHANGELOG.md
@@ -1,19 +1,27 @@
# Changelog
### [v0.3.0]((https://github.com/joone/loz/compare/v0.2.13...v0.3.0)) - 2024-02-24

### [v0.3.0](<(https://github.com/joone/loz/compare/v0.2.13...v0.3.0)>) - 2024-02-24

- **Added**
- Run Linux commands based on user prompts. Users can now execute Linux commands using natural language. For example, by running `loz "find the largest file in the current directory"`,
`Loz` will interpret the instruction and execute the corresponding Linux commands like `find . -type f -exec ls -l {} + | sort -k 5 -nr | head -n 1` to find the largest file.
`Loz` will interpret the instruction and execute the corresponding Linux commands like `find . -type f -exec ls -l {} + | sort -k 5 -nr | head -n 1` to find the largest file.

### [v0.2.13](https://github.com/joone/loz/compare/v0.2.12...v0.2.13) - 2024-02-22

- **Added**
- Enhanced Git Commit Formatting: Commit messages are now structured with a clear separation between the title and body, improving readability and adherence to Git best practices.

## [v0.2.12](https://github.com/joone/loz/compare/v0.2.11...v0.2.12) - 2024-02-15

- **Added**
- Add support for all models compatible with Ollama

## [v0.2.11](https://github.com/joone/loz/compare/v0.2.10...v0.2.11) - 2024-02-13

- **Added**
- Store OpenAI API Key in `config.json` ([#17](https://github.com/joone/loz/pull/17), contributed by @honeymaro)

## v0.2.0 - 2024-02-01

- **Added**
- Add support for llama2 and codellama models via ollama integration.
26 changes: 24 additions & 2 deletions README.md
@@ -1,17 +1,24 @@
# Loz [![NPM](https://img.shields.io/npm/v/chatgpt.svg)](https://www.npmjs.com/package/loz)

![alt Loz Demo](https://github.com/joone/loz/blob/main/examples/loz_demo.gif?raw=true)

Loz is a command-line tool that enables your preferred LLM to execute system commands and utilize Unix pipes, integrating AI capabilities with other Unix tools.

## What's New

### v0.3.0 - 2024-02-24

- **Added**
- Run Linux commands based on user prompts. Users can now execute Linux commands using natural language. For example, by running `loz "find the largest file in the current directory"`,
`Loz` will interpret the instruction and execute the corresponding Linux commands like `find . -type f -exec ls -l {} + | sort -k 5 -nr | head -n 1` to find the largest file. See more [examples](#examples).
`Loz` will interpret the instruction and execute the corresponding Linux commands like `find . -type f -exec ls -l {} + | sort -k 5 -nr | head -n 1` to find the largest file. See more [examples](#examples).

### v0.2.13 - 2024-02-22

- **Added**
- Enhanced Git Commit Formatting: Commit messages are now structured with a clear separation between the title and body, improving readability and adherence to Git best practices.

### v0.2.12 - 2024-02-15

- **Added**
- Add support for all models compatible with Ollama

@@ -42,6 +49,7 @@ $ ./install.sh
Loz supports [OpenAI API](https://platform.openai.com/docs/quickstart?context=node) and [Ollama](https://github.com/ollama/ollama) so you can switch between these LLM services easily, using the `config` command in the interactive mode.

### Set up Ollama

To utilize Ollama on your local system, you'll need to install both llama2 and codellama models. Here's how you can do it on a Linux system:

```
@@ -90,14 +98,17 @@ Choose your LLM service: (ollama, openai)
```

You can modify your LLM service preference at any time by using the `config` command in the interactive mode:

```
> config api openai
```

Additionally, you can change the model by entering:

```
> config model llama2
```

or

```
@@ -111,6 +122,7 @@ You can check the current settings by entering:
api: ollama
model: llama2
```

Currently, gpt-3.5-turbo and all models provided by Ollama are supported.

### Interactive mode
@@ -122,16 +134,20 @@ $ loz
Once loz is running, you can start a conversation by interacting with it. loz will respond with a relevant message based on the input.

### Run Linux Commands with Loz

Loz empowers users to execute Linux commands using natural language. Below are some examples demonstrating how `loz`'s LLM backend translates natural language into Linux commands:

#### Examples

- Find the largest file in the current directory:

```
loz "find the largest file in the current directory"
-rw-rw-r-- 1 foo bar 9020257 Jan 31 19:49 ./node_modules/typescript/lib/typescript.js
```

- Check if Apache2 is running:

```
loz "check if apache2 is running on this system"
● apache2.service - The Apache HTTP Server
@@ -142,22 +158,28 @@ Loz empowers users to execute Linux commands using natural language. Below are s
loz "Detect GPUs on this system"
00:02.0 VGA compatible controller: Intel Corporation Device a780 (rev 04)
```
For your information, this feature has only been tested with the OpenAI API.
For your information, this feature has only been tested with the OpenAI API.

#### Caution

To prevent unintentional system modifications, avoid running commands that can alter or remove system files or configurations, such as `rm`, `mv`, `rmdir`, or `mkfs`.

#### Safe Mode

To enhance security and avoid unintended command execution, loz can be run in Safe Mode. When activated, this mode requires user confirmation before executing any Linux command.

Activate Safe Mode by setting the LOZ_SAFE=true environment variable:

```
LOZ_SAFE=true loz "Check available memory on this system"
```

Upon execution, loz will prompt:

```
Do you want to run this command?: free -h (y/n)
```

Respond with 'y' to execute the command or 'n' to cancel. This feature ensures that you have full control over the commands executed, preventing accidental changes or data loss.
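
The Safe Mode flow described above reduces to a y/n confirmation gate in front of command execution. As a minimal sketch of that decision logic — `shouldExecute` is a hypothetical helper for illustration, not a function from loz's source, and the real tool reads the answer interactively via `readline/promises`:

```typescript
// Illustrative sketch of a Safe Mode confirmation gate.
// `shouldExecute` is a hypothetical name, not part of loz's actual API.
function shouldExecute(safeMode: boolean, answer: string): boolean {
  // Without LOZ_SAFE=true, commands run immediately.
  if (!safeMode) return true;
  // In Safe Mode, only an explicit "y" (case-insensitive) proceeds.
  return answer.trim().toLowerCase() === "y";
}

// Mirrors the prompt: Do you want to run this command?: free -h (y/n)
console.log(shouldExecute(true, "y")); // true
console.log(shouldExecute(true, "n")); // false
console.log(shouldExecute(false, "")); // true
```

Keeping the gate as a small pure function like this makes the confirmation behavior easy to unit-test separately from the interactive prompt.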

### Pipe mode
12 changes: 6 additions & 6 deletions src/config/index.ts
@@ -25,13 +25,13 @@ export class ConfigItem implements ConfigItemInterface {
}

export const requestApiKey = async (
rl: readlinePromises.Interface
rl: readlinePromises.Interface,
): Promise<string> => {
for (const key of ["LOZ_OPENAI_API_KEY", "OPENAI_API_KEY"]) {
const value = process.env[key];
if (value) {
const useApiKeyFromEnv = await rl.question(
`\n${key} found in environment variables. Do you want to use it? (y/n) `
`\n${key} found in environment variables. Do you want to use it? (y/n) `,
);
if (useApiKeyFromEnv.toLowerCase() === "y") {
return value;
@@ -52,7 +52,7 @@ export const requestApiKey = async (
};

const requestApiName = async (
rl: readlinePromises.Interface
rl: readlinePromises.Interface,
): Promise<string> => {
const res = await rl.question("Choose your LLM service: (ollama, openai) ");
if (!["ollama", "openai"].includes(res)) {
@@ -87,12 +87,12 @@ export class Config implements ConfigInterface {
if (value === "openai")
this.setInternal(
"model",
this.get("openai.model")?.value || DEFAULT_OPENAI_MODEL
this.get("openai.model")?.value || DEFAULT_OPENAI_MODEL,
);
else if (value === "ollama")
this.setInternal(
"model",
this.get("ollama.model")?.value || DEFAULT_OLLAMA_MODEL
this.get("ollama.model")?.value || DEFAULT_OLLAMA_MODEL,
);
else {
console.log("Invalid API");
@@ -158,7 +158,7 @@ export class Config implements ConfigInterface {
if (name === "ollama") {
this.set("model", DEFAULT_OLLAMA_MODEL);
console.log(
`\nYou should install ${name} with llama2 and codellama models: see https://ollama.ai/download \n`
`\nYou should install ${name} with llama2 and codellama models: see https://ollama.ai/download \n`,
);
} else if (name === "openai") {
this.set("model", DEFAULT_OPENAI_MODEL);
8 changes: 4 additions & 4 deletions src/git/index.ts
@@ -18,7 +18,7 @@ export class Git {
return;
}
resolve(stdout);
}
},
);
});
}
@@ -42,7 +42,7 @@ export class Git {
return;
}
resolve(stdout);
}
},
);
});
}
@@ -63,7 +63,7 @@ export class Git {
return;
}
resolve(stdout);
}
},
);
});
}
@@ -84,7 +84,7 @@ export class Git {
return;
}
resolve(stdout);
}
},
);
});
}
2 changes: 1 addition & 1 deletion src/history/test.ts
@@ -40,7 +40,7 @@ describe("ChatHistoryManager", () => {

// Read the file and check its content
const savedData = JSON.parse(
fs.readFileSync(path.join(testConfigPath, files[0]), "utf8")
fs.readFileSync(path.join(testConfigPath, files[0]), "utf8"),
);
expect(savedData.dialogue.length).to.equal(1);
expect(savedData.dialogue[0]).to.deep.equal({
2 changes: 1 addition & 1 deletion src/index.ts
@@ -87,7 +87,7 @@ async function handleCodeDiffFromPipe(): Promise<void> {
}

process.stdout.write(
completion.content + "\n\nGenerated by " + completion.model + "\n"
completion.content + "\n\nGenerated by " + completion.model + "\n",
);
});
}
4 changes: 2 additions & 2 deletions src/llm/index.ts
@@ -26,7 +26,7 @@ export class OpenAiAPI extends LLMService {
}

public async completion(
params: LLMSettings
params: LLMSettings,
): Promise<{ content: string; model: string }> {
if (DEBUG) {
console.log("OpenAI completion");
@@ -93,7 +93,7 @@ export class OllamaAPI extends LLMService {
}

public async completion(
params: LLMSettings
params: LLMSettings,
): Promise<{ content: string; model: string }> {
if (DEBUG) {
console.log("Ollama completion");
6 changes: 3 additions & 3 deletions src/loz.ts
@@ -82,7 +82,7 @@ export class Loz {
if (DEBUG) console.log(result);
if (result.indexOf("ollama") === -1) {
console.log(
"Please install ollama with llama2 and codellama first: see https://ollama.ai/download \n"
"Please install ollama with llama2 and codellama first: see https://ollama.ai/download \n",
);
process.exit(1);
}
@@ -152,7 +152,7 @@ export class Loz {

try {
await this.git.commit(
complete.content + "\n\nGenerated by " + complete.model
complete.content + "\n\nGenerated by " + complete.model,
);
const commitHEAD = await this.git.showHEAD();
console.log("\n# Generated commit message: \n");
@@ -362,7 +362,7 @@ export class Loz {
output: process.stdout,
});
answer = await rl.question(
`Do you want to run this command?: ${linuxCommand} (y/n) `
`Do you want to run this command?: ${linuxCommand} (y/n) `,
);
rl.close();
} catch (error) {
16 changes: 8 additions & 8 deletions test/command.test.ts
@@ -38,22 +38,22 @@ describe("Linux Command Test", () => {
// lspci | grep -i vga
it("Detect GPUs on this system", function () {
let stdout = execSync(
`MOCHA_ENV=test node ${LOZ_BIN} "Detect GPUs on this system"`
`MOCHA_ENV=test node ${LOZ_BIN} "Detect GPUs on this system"`,
).toString();
expect(stdout).to.include("VGA compatible controller");
});

if (GITHUB_ACTIONS === false) {
it("Run find . -type f -exec ls -l {} + | sort -k 5 -nr | head -n 1", function () {
let stdout = execSync(
`MOCHA_ENV=test node ${LOZ_BIN} "find the largest file in the current directory"`
`MOCHA_ENV=test node ${LOZ_BIN} "find the largest file in the current directory"`,
).toString();
expect(stdout).to.include("typescript.js");
});

it("Run systemctl status apache2", function () {
let stdout = execSync(
`MOCHA_ENV=test node ${LOZ_BIN} "check if apache2 is runnig on this system"`
`MOCHA_ENV=test node ${LOZ_BIN} "check if apache2 is runnig on this system"`,
).toString();
expect(stdout).to.include("The Apache HTTP Server");
});
@@ -62,33 +62,33 @@ describe("Linux Command Test", () => {
// Get the system's current date and time
it("Get the system's current date and time", function () {
let stdout = execSync(
`MOCHA_ENV=test node ${LOZ_BIN} "Get the current date and time on this system"`
`MOCHA_ENV=test node ${LOZ_BIN} "Get the current date and time on this system"`,
).toString();
if (GITHUB_ACTIONS === false) {
// Fri Feb 23 10:57:41 PM PST 2024
expect(stdout).to.match(
/\w{3} \w{3} \s?\d{1,2} \d{2}:\d{2}:\d{2} (AM|PM) \w{3} \d{4}/
/\w{3} \w{3} \s?\d{1,2} \d{2}:\d{2}:\d{2} (AM|PM) \w{3} \d{4}/,
);
} else {
// Sat Feb 24 06:49:11 UTC 2024
expect(stdout).to.match(
/\w{3} \w{3} \s?\d{1,2} \d{2}:\d{2}:\d{2} UTC \d{4}/
/\w{3} \w{3} \s?\d{1,2} \d{2}:\d{2}:\d{2} UTC \d{4}/,
);
}
});

// Check available memory
it("Check available memory on this system", function () {
let stdout = execSync(
`MOCHA_ENV=test node ${LOZ_BIN} "Check available memory on this Linux"`
`MOCHA_ENV=test node ${LOZ_BIN} "Check available memory on this Linux"`,
).toString();
expect(stdout).to.match(/Mem:/);
});

// grep 'sfsfsfcf' *
it("Handle no output", function () {
let stdout = execSync(
`MOCHA_ENV=test node ${LOZ_BIN} "Find sfsdfef text in files in the current directory"`
`MOCHA_ENV=test node ${LOZ_BIN} "Find sfsdfef text in files in the current directory"`,
).toString();
expect(stdout).to.match(/No output/);
});