Merge pull request #29 from ajcwebdev/next
More JSDoc Work and Docker Update
ajcwebdev authored Sep 30, 2024
2 parents 1b37e68 + aa1ba7b commit 4825dfd
Showing 29 changed files with 816 additions and 517 deletions.
73 changes: 73 additions & 0 deletions .github/workflows/scorecard.yml
@@ -0,0 +1,73 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.

name: Scorecard supply-chain security
on:
  # For Branch-Protection check. Only the default branch is supported. See
  # https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
  branch_protection_rule:
  # To guarantee Maintained check is occasionally updated. See
  # https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
  schedule:
    - cron: '15 11 * * 5'
  push:
    branches: [ "main" ]

# Declare default permissions as read only.
permissions: read-all

jobs:
  analysis:
    name: Scorecard analysis
    runs-on: ubuntu-latest
    permissions:
      # Needed to upload the results to code-scanning dashboard.
      security-events: write
      # Needed to publish results and get a badge (see publish_results below).
      id-token: write
      # Uncomment the permissions below if installing in a private repository.
      # contents: read
      # actions: read

    steps:
      - name: "Checkout code"
        uses: actions/checkout@b4ffde65f46336ab88eb53be808477a3936bae11 # v4.1.1
        with:
          persist-credentials: false

      - name: "Run analysis"
        uses: ossf/scorecard-action@0864cf19026789058feabb7e87baa5f140aac736 # v2.3.1
        with:
          results_file: results.sarif
          results_format: sarif
          # (Optional) "write" PAT token. Uncomment the `repo_token` line below if:
          # - you want to enable the Branch-Protection check on a *public* repository, or
          # - you are installing Scorecard on a *private* repository
          # To create the PAT, follow the steps in https://github.com/ossf/scorecard-action?tab=readme-ov-file#authentication-with-fine-grained-pat-optional.
          # repo_token: ${{ secrets.SCORECARD_TOKEN }}

          # Public repositories:
          # - Publish results to OpenSSF REST API for easy access by consumers
          # - Allows the repository to include the Scorecard badge.
          # - See https://github.com/ossf/scorecard-action#publishing-results.
          # For private repositories:
          # - `publish_results` will always be set to `false`, regardless
          #   of the value entered here.
          publish_results: true

      # Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF
      # format to the repository Actions tab.
      - name: "Upload artifact"
        uses: actions/upload-artifact@97a0fba1372883ab732affbe8f94b823f91727db # v3.pre.node20
        with:
          name: SARIF file
          path: results.sarif
          retention-days: 5

      # Upload the results to GitHub's code scanning dashboard (optional).
      # Commenting out will disable upload of results to your repo's Code Scanning dashboard
      - name: "Upload to code-scanning"
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: results.sarif
3 changes: 2 additions & 1 deletion .gitignore
@@ -7,4 +7,5 @@ package-lock.json
src/llms/models
.idea
build
deno.lock
deno.lock
out
139 changes: 38 additions & 101 deletions docs/examples.md
@@ -243,7 +243,7 @@ npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --assembly
Include speaker labels and number of speakers:

```bash
npm run as -- --video "https://ajc.pics/audio/fsjam-short.mp3" --assembly --speaker-labels --speakers-expected 2
npm run as -- --video "https://ajc.pics/audio/fsjam-short.mp3" --assembly --speakerLabels
```

### Whisper.cpp
@@ -267,19 +267,18 @@ npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisper me
npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisper large
```

Run `whisper.cpp` in a Docker container with `--whisper-docker`:
Run `whisper.cpp` in a Docker container with `--whisperDocker`:

```bash
npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisper-docker tiny
npm run as -- --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
```

## Docker Compose

This will run both `whisper.cpp` and the AutoShow Commander CLI in their own Docker containers.

```bash
docker-compose up --build -d
docker-compose run autoshow --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisper-docker base
docker-compose run autoshow --video "https://www.youtube.com/watch?v=MORMZXEaONk" --whisperDocker base
```

Currently working on the `llama.cpp` Docker integration so the entire project can be encapsulated in one local Docker Compose file.
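As a rough sketch of where that could land, a single `docker-compose.yml` along these lines would build all three images together (the `whisper` and `llama` service names are illustrative assumptions, not the project's actual configuration; the Dockerfile paths are the ones already referenced in this repo):

```yml
services:
  # Transcription backend built from the existing whisper.cpp Dockerfile
  whisper:
    build:
      context: .
      dockerfile: .github/whisper.Dockerfile
  # LLM backend built from the llama.cpp Dockerfile (integration in progress)
  llama:
    build:
      context: .
      dockerfile: .github/llama.Dockerfile
  # The AutoShow CLI itself, started after both backend containers
  autoshow:
    build:
      context: .
      dockerfile: Dockerfile
    depends_on:
      - whisper
      - llama
```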
@@ -380,100 +379,38 @@ npm run test-local
This can be a useful way of creating a single markdown file of the entire project for giving to an LLM as context to develop new features or debug code. I'll usually start a conversation by including this along with a prompt that explains what I want changed or added.

```bash
cat README.md >> LLM.md && \
echo '\n\n```' >> LLM.md && \
tree >> LLM.md && \
echo '\n```' >> LLM.md && \
echo '\n\n' >> LLM.md && \
cat docs/examples.md >> LLM.md && \
echo '\n## AutoShow CLI Entry Point' >> LLM.md && \
echo '\n\n```js' >> LLM.md && \
cat src/autoshow.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '## Utility Functions\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/utils/cleanUpFiles.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/utils/downloadAudio.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/utils/generateMarkdown.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/utils/runLLM.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/utils/runTranscription.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '## Transcription Functions\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/transcription/whisper.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/transcription/deepgram.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/transcription/assembly.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '## LLM Functions\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/prompt.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/chatgpt.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/claude.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/cohere.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/gemini.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/llama.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/llamacpp.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/mistral.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/llms/octo.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '## Process Commands\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/commands/processVideo.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/commands/processURLs.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/commands/processRSS.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/commands/processPlaylist.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```js' >> LLM.md && \
cat src/commands/processFile.js >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '## Docker Files\n' >> LLM.md && \
echo '```Dockerfile' >> LLM.md && \
cat .github/whisper.Dockerfile >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```Dockerfile' >> LLM.md && \
cat .github/llama.Dockerfile >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```Dockerfile' >> LLM.md && \
cat Dockerfile >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```yml' >> LLM.md && \
cat docker-compose.yml >> LLM.md && \
echo '\n```\n' >> LLM.md && \
echo '```bash' >> LLM.md && \
cat docker-entrypoint.sh >> LLM.md && \
echo '\n```\n' >> LLM.md
export MD="LLM.md" && export COMMANDS="src/commands" && export UTILS="src/utils" && \
export LLMS="src/llms" && export TRANSCRIPT="src/transcription" && \
export OPEN="\n\n\`\`\`js" && export CLOSE="\n\`\`\`\n\n" && cat README.md >> $MD && \
echo '\n\n### Directory and File Structure\n\n```' >> $MD && tree >> $MD && \
echo '```\n\n## Example CLI Commands Test Suite'$OPEN'' >> $MD && cat test/all.test.js >> $MD && \
echo ''$CLOSE'## JSDoc Types'$OPEN'' >> $MD && cat src/types.js >> $MD && \
echo ''$CLOSE'## AutoShow CLI Entry Point'$OPEN'' >> $MD && cat src/autoshow.js >> $MD && \
echo ''$CLOSE'## Utility Functions\n\n### Generate Markdown'$OPEN'' >> $MD && cat $UTILS/generateMarkdown.js >> $MD && \
echo ''$CLOSE'### Download Audio'$OPEN'' >> $MD && cat $UTILS/downloadAudio.js >> $MD && \
echo ''$CLOSE'### Run Transcription'$OPEN'' >> $MD && cat $UTILS/runTranscription.js >> $MD && \
echo ''$CLOSE'### Run LLM'$OPEN'' >> $MD && cat $UTILS/runLLM.js >> $MD && \
echo ''$CLOSE'### Clean Up Files'$OPEN'' >> $MD && cat $UTILS/cleanUpFiles.js >> $MD && \
echo ''$CLOSE'## Process Commands\n\n### Process Video'$OPEN'' >> $MD && cat $COMMANDS/processVideo.js >> $MD && \
echo ''$CLOSE'### Process Playlist'$OPEN'' >> $MD && cat $COMMANDS/processPlaylist.js >> $MD && \
echo ''$CLOSE'### Process URLs'$OPEN'' >> $MD && cat $COMMANDS/processURLs.js >> $MD && \
echo ''$CLOSE'### Process RSS'$OPEN'' >> $MD && cat $COMMANDS/processRSS.js >> $MD && \
echo ''$CLOSE'### Process File'$OPEN'' >> $MD && cat $COMMANDS/processFile.js >> $MD && \
echo ''$CLOSE'## Transcription Functions\n\n### Call Whisper'$OPEN'' >> $MD && cat $TRANSCRIPT/whisper.js >> $MD && \
echo ''$CLOSE'### Call Deepgram'$OPEN'' >> $MD && cat $TRANSCRIPT/deepgram.js >> $MD && \
echo ''$CLOSE'### Call Assembly'$OPEN'' >> $MD && cat $TRANSCRIPT/assembly.js >> $MD && \
echo ''$CLOSE'## LLM Functions\n\n### Prompt Function'$OPEN'' >> $MD && cat $LLMS/prompt.js >> $MD && \
echo ''$CLOSE'### Call ChatGPT'$OPEN'' >> $MD && cat $LLMS/chatgpt.js >> $MD && \
echo ''$CLOSE'### Call Claude'$OPEN'' >> $MD && cat $LLMS/claude.js >> $MD && \
echo ''$CLOSE'### Call Cohere'$OPEN'' >> $MD && cat $LLMS/cohere.js >> $MD && \
echo ''$CLOSE'### Call Gemini'$OPEN'' >> $MD && cat $LLMS/gemini.js >> $MD && \
echo ''$CLOSE'### Call Llama.cpp'$OPEN'' >> $MD && cat $LLMS/llama.js >> $MD && \
echo ''$CLOSE'### Call Mistral'$OPEN'' >> $MD && cat $LLMS/mistral.js >> $MD && \
echo ''$CLOSE'### Call Octo'$OPEN'' >> $MD && cat $LLMS/octo.js >> $MD && \
echo ''$CLOSE'## Docker Files\n\n```Dockerfile' >> $MD && cat .github/whisper.Dockerfile >> $MD && \
echo ''$CLOSE'```Dockerfile' >> $MD && cat .github/llama.Dockerfile >> $MD && \
echo ''$CLOSE'```Dockerfile' >> $MD && cat Dockerfile >> $MD && \
echo ''$CLOSE'```yml' >> $MD && cat docker-compose.yml >> $MD && \
echo ''$CLOSE'```bash' >> $MD && cat docker-entrypoint.sh >> $MD && \
echo '\n```\n' >> $MD
```
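One portability note on the rewritten script above: the bare `echo '\n...'` calls rely on the shell's built-in `echo` expanding `\n` escapes, which zsh and dash do but bash does not by default. If the generated `LLM.md` ends up containing literal `\n` characters, a `printf`-based variant (shown here only as a sketch of one possible adaptation) behaves the same in both shells:

```bash
# printf expands \n in its format string in every POSIX shell,
# so the section separators render identically in bash and zsh
printf '\n\n## JSDoc Types\n\n' >> LLM.md
```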
