
[BUG] Failed transcriptions stuck in queue #79

Open
LegateLaurie opened this issue Jan 20, 2024 · 6 comments

Comments

@LegateLaurie
LegateLaurie commented Jan 20, 2024

Hi, I've had some files that failed to transcribe. They produced error logs in the whishper_data\logs directory, but no error message appeared on the web client, so they can't be deleted from the queue and display permanently as "waiting for transcription". The same happened with queued translations.

[screenshot of the stuck queue]

The backend.err.log file for one of these contains only "3:42AM INF Starting monitor!" (the raw file includes ANSI colour escape codes around each field), and the rest are essentially identical.

I'm not sure what's caused the error with these, but attempting to process the same file again, either for translation or captioning, seems to work.

Environment

I'm on Windows (WSL2 Ubuntu), using Whishper 3.1.2.

Logs and Configuration

Docker Compose Logs

whisper-libretranslate  | Updating language models
whisper-libretranslate  | Found 86 models
whisper-libretranslate  | Keep 4 models
whisper-libretranslate  | Loaded support for 3 languages (4 models total)!
whisper-libretranslate  | Running on http://0.0.0.0:5000
whisper-libretranslate  | (the five lines above repeat nine more times)
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"CONTROL",  "id":4784927, "ctx":"SignalHandler","msg":"Shutting down the HealthLog"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"CONTROL",  "id":4784928, "ctx":"SignalHandler","msg":"Shutting down the TTL monitor"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"INDEX",    "id":3684100, "ctx":"SignalHandler","msg":"Shutting down TTL collection monitor thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"INDEX",    "id":3684101, "ctx":"SignalHandler","msg":"Finished shutting down TTL collection monitor thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"CONTROL",  "id":6278511, "ctx":"SignalHandler","msg":"Shutting down the Change Stream Expired Pre-images Remover"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"CONTROL",  "id":4784929, "ctx":"SignalHandler","msg":"Acquiring the global lock for shutdown"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"CONTROL",  "id":4784930, "ctx":"SignalHandler","msg":"Shutting down the storage engine"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22320,   "ctx":"SignalHandler","msg":"Shutting down journal flusher thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22321,   "ctx":"SignalHandler","msg":"Finished shutting down journal flusher thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22322,   "ctx":"SignalHandler","msg":"Shutting down checkpoint thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22323,   "ctx":"SignalHandler","msg":"Finished shutting down checkpoint thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22261,   "ctx":"SignalHandler","msg":"Timestamp monitor shutting down"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":20282,   "ctx":"SignalHandler","msg":"Deregistering all the collections"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22317,   "ctx":"SignalHandler","msg":"WiredTigerKVEngine shutting down"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22318,   "ctx":"SignalHandler","msg":"Shutting down session sweeper thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.212+00:00"},"s":"I",  "c":"STORAGE",  "id":22319,   "ctx":"SignalHandler","msg":"Finished shutting down session sweeper thread"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:23.237+00:00"},"s":"I",  "c":"STORAGE",  "id":4795902, "ctx":"SignalHandler","msg":"Closing WiredTiger","attr":{"closeConfig":"leak_memory=true,use_timestamp=false,"}}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.902+00:00"},"s":"I",  "c":"STORAGE",  "id":4795901, "ctx":"SignalHandler","msg":"WiredTiger closed","attr":{"durationMillis":4665}}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.902+00:00"},"s":"I",  "c":"STORAGE",  "id":22279,   "ctx":"SignalHandler","msg":"shutdown: removing fs lock..."}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.903+00:00"},"s":"I",  "c":"-",        "id":4784931, "ctx":"SignalHandler","msg":"Dropping the scope cache for shutdown"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.903+00:00"},"s":"I",  "c":"FTDC",     "id":20626,   "ctx":"SignalHandler","msg":"Shutting down full-time diagnostic data capture"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.905+00:00"},"s":"I",  "c":"CONTROL",  "id":20565,   "ctx":"SignalHandler","msg":"Now exiting"}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.905+00:00"},"s":"I",  "c":"CONTROL",  "id":8423404, "ctx":"SignalHandler","msg":"mongod shutdown complete","attr":{"Summary of time elapsed":{"Statistics":{"Enter terminal shutdown":"0 ms","Step down the replication coordinator for shutdown":"0 ms","Time spent in quiesce mode":"0 ms","Shut down FLE Crud subsystem":"0 ms","Shut down MirrorMaestro":"0 ms","Shut down WaitForMajorityService":"0 ms","Shut down the logical session cache":"0 ms","Shut down the transport layer":"1 ms","Shut down the global connection pool":"0 ms","Shut down the flow control ticket holder":"0 ms","Kill all operations for shutdown":"0 ms","Shut down all tenant migration access blockers on global shutdown":"0 ms","Shut down all open transactions":"0 ms","Acquire the RSTL for shutdown":"0 ms","Shut down the IndexBuildsCoordinator and wait for index builds to finish":"0 ms","Shut down the replica set monitor":"0 ms","Shut down the migration util executor":"0 ms","Shut down the health log":"0 ms","Shut down the TTL monitor":"0 ms","Shut down expired pre-images and documents removers":"0 ms","Shut down the storage engine":"4691 ms","Wait for the oplog cap maintainer thread to stop":"0 ms","Shut down full-time data capture":"0 ms","shutdownTask total elapsed time":"4694 ms"}}}}
mongo-1                 | {"t":{"$date":"2024-01-16T15:06:27.905+00:00"},"s":"I",  "c":"CONTROL",  "id":23138,   "ctx":"SignalHandler","msg":"Shutting down","attr":{"exitCode":0}}
mongo-1                 |
mongo-1                 | MongoDB init process complete; ready for start up.
mongo-1                 |
mongo-1                 | {"t":{"$date":"2024-01-16T15:18:14.547Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-16T15-18-14"}}
mongo-1                 | {"t":{"$date":"2024-01-16T16:31:13.323Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-16T16-31-13"}}
mongo-1                 | {"t":{"$date":"2024-01-17T09:40:47.768Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-17T09-40-47"}}
mongo-1                 | {"t":{"$date":"2024-01-17T11:21:00.396Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-17T11-21-00"}}
mongo-1                 | {"t":{"$date":"2024-01-19T21:10:41.749Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-19T21-10-41"}}
mongo-1                 | {"t":{"$date":"2024-01-20T01:16:26.203Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T01-16-26"}}
mongo-1                 | {"t":{"$date":"2024-01-20T02:22:32.055Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T02-22-32"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:12:01.305Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-12-01"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:27:11.924Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-27-11"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:27:22.520Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-27-22"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:27:31.987Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-27-31"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:27:40.184Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-27-40"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:27:50.724Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-27-50"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:28:02.312Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-28-02"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:28:14.579Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-28-14"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:28:25.793Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-28-25"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:28:33.986Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-28-33"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:28:42.668Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-28-42"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:28:53.600Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-28-53"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:29:03.493Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-29-03"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:29:15.090Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-29-15"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:29:28.898Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-29-28"}}
mongo-1                 | {"t":{"$date":"2024-01-20T03:32:24.494Z"},"s":"I",  "c":"CONTROL",  "id":20697,   "ctx":"main","msg":"Renamed existing log file","attr":{"oldLogPath":"/var/log/mongodb/mongod.log","newLogPath":"/var/log/mongodb/mongod.log.2024-01-20T03-32-24"}}
whishper                | 2024-01-20 03:10:31,627 INFO waiting for backend, frontend, nginx, transcription to die
whishper                | 2024-01-20 03:10:32,061 INFO stopped: transcription (exit status 0)
whishper                | 2024-01-20 03:10:32,092 INFO stopped: nginx (exit status 0)
whishper                | 2024-01-20 03:10:32,101 WARN stopped: frontend (terminated by SIGTERM)
whishper                | 2024-01-20 03:10:32,210 WARN stopped: backend (terminated by SIGTERM)
whishper                | 2024-01-20 03:12:01,169 CRIT Supervisor is running as root.  Privileges were not dropped because no user is specified in the config file.  If you intend to run as root, you can set user=root in the config file to avoid this message.
whishper                | 2024-01-20 03:12:01,172 INFO supervisord started with pid 1
whishper                | 2024-01-20 03:12:02,176 INFO spawned: 'backend' with pid 7
whishper                | 2024-01-20 03:12:02,179 INFO spawned: 'frontend' with pid 8
whishper                | 2024-01-20 03:12:02,182 INFO spawned: 'nginx' with pid 9
whishper                | 2024-01-20 03:12:02,185 INFO spawned: 'transcription' with pid 10
whishper                | 2024-01-20 03:12:03,312 INFO success: backend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:12:03,312 INFO success: frontend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:12:03,312 INFO success: nginx entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:12:03,312 INFO success: transcription entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:26:42,715 WARN received SIGTERM indicating exit request
whishper                | 2024-01-20 03:26:42,715 INFO waiting for backend, frontend, nginx, transcription to die
whishper                | 2024-01-20 03:26:42,980 INFO stopped: transcription (exit status 0)
whishper                | 2024-01-20 03:26:42,981 INFO stopped: nginx (exit status 0)
whishper                | 2024-01-20 03:26:42,985 WARN stopped: frontend (terminated by SIGTERM)
whishper                | 2024-01-20 03:26:42,987 WARN stopped: backend (terminated by SIGTERM)
whishper                | 2024-01-20 03:27:12,200 CRIT Supervisor is running as root.  Privileges were not dropped because no user is specified in the config file.  If you intend to run as root, you can set user=root in the config file to avoid this message.
whishper                | 2024-01-20 03:27:12,201 INFO supervisord started with pid 1
whishper                | 2024-01-20 03:27:13,206 INFO spawned: 'backend' with pid 7
whishper                | 2024-01-20 03:27:13,210 INFO spawned: 'frontend' with pid 8
whishper                | 2024-01-20 03:27:13,214 INFO spawned: 'nginx' with pid 9
whishper                | 2024-01-20 03:27:13,222 INFO spawned: 'transcription' with pid 11
whishper                | 2024-01-20 03:27:14,953 INFO success: backend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:27:14,953 INFO success: frontend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:27:14,953 INFO success: nginx entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:27:14,953 INFO success: transcription entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:29:40,111 WARN received SIGTERM indicating exit request
whishper                | 2024-01-20 03:29:40,112 INFO waiting for backend, frontend, nginx, transcription to die
whishper                | 2024-01-20 03:29:40,349 INFO stopped: transcription (exit status 0)
whishper                | 2024-01-20 03:29:40,350 INFO stopped: nginx (exit status 0)
whishper                | 2024-01-20 03:29:40,353 WARN stopped: frontend (terminated by SIGTERM)
whishper                | 2024-01-20 03:29:40,354 WARN stopped: backend (terminated by SIGTERM)
whishper                | 2024-01-20 03:32:24,349 CRIT Supervisor is running as root.  Privileges were not dropped because no user is specified in the config file.  If you intend to run as root, you can set user=root in the config file to avoid this message.
whishper                | 2024-01-20 03:32:24,351 INFO supervisord started with pid 1
whishper                | 2024-01-20 03:32:25,355 INFO spawned: 'backend' with pid 7
whishper                | 2024-01-20 03:32:25,358 INFO spawned: 'frontend' with pid 8
whishper                | 2024-01-20 03:32:25,361 INFO spawned: 'nginx' with pid 9
whishper                | 2024-01-20 03:32:25,364 INFO spawned: 'transcription' with pid 10
whishper                | 2024-01-20 03:32:26,565 INFO success: backend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:32:26,565 INFO success: frontend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:32:26,565 INFO success: nginx entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:32:26,565 INFO success: transcription entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
whishper                | 2024-01-20 03:42:02,439 WARN exited: backend (terminated by SIGKILL; not expected)
whishper                | 2024-01-20 03:42:02,775 INFO spawned: 'backend' with pid 56
whishper                | 2024-01-20 03:42:03,951 INFO success: backend entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)

Docker Compose File

version: "3.9"

services:
  mongo:
    image: mongo
    env_file:
      - .env
    restart: unless-stopped
    volumes:
      - ./whishper_data/db_data:/data/db
      - ./whishper_data/db_data/logs/:/var/log/mongodb/
    environment:
      MONGO_INITDB_ROOT_USERNAME: ${DB_USER:-whishper}
      MONGO_INITDB_ROOT_PASSWORD: ${DB_PASS:-whishper}
    expose:
      - 27017
    command: ['--logpath', '/var/log/mongodb/mongod.log']

  translate:
    container_name: whisper-libretranslate
    image: libretranslate/libretranslate:latest
    restart: unless-stopped
    volumes:
      - ./whishper_data/libretranslate/data:/home/libretranslate/.local/share
      - ./whishper_data/libretranslate/cache:/home/libretranslate/.local/cache
    env_file:
      - .env
    tty: true
    environment:
      LT_DISABLE_WEB_UI: True
      LT_UPDATE_MODELS: True
    expose:
      - 5000
    networks:
      default:
        aliases:
          - translate
    healthcheck:
      test: ['CMD-SHELL', './venv/bin/python scripts/healthcheck.py']
      interval: 2s
      timeout: 3s
      retries: 5

  whishper:
    pull_policy: always
    image: pluja/whishper:${WHISHPER_VERSION:-latest}
    env_file:
      - .env
    volumes:
      - ./whishper_data/uploads:/app/uploads
      - ./whishper_data/logs:/var/log/whishper
    container_name: whishper
    restart: unless-stopped
    networks:
      default:
        aliases:
          - whishper
    ports:
      - 8082:80
    depends_on:
      - mongo
      - translate
    environment:
      PUBLIC_INTERNAL_API_HOST: "http://127.0.0.1:80"
      PUBLIC_TRANSLATION_API_HOST: ""
      PUBLIC_API_HOST: ${WHISHPER_HOST:-}
      PUBLIC_WHISHPER_PROFILE: cpu
      WHISPER_MODELS_DIR: /app/models
      UPLOAD_DIR: /app/uploads
      CPU_THREADS: 4

I am able to edit the database entries directly to remove those queued processes, but adding a clear button to the web UI might be a better solution. I apologise if I've missed anything useful to mention. Thanks

@hmu-duc-anh

Hi, please show me how to manually remove failed transcriptions stuck in queue, thank you!

@LegateLaurie
Author

LegateLaurie commented Jan 22, 2024

For me it was relatively complicated, but I think it's worth it if you already use MongoDB. I've since edited the docker-compose.yml so that under the mongo config, instead of "expose: 27017" I have:

ports:
  - 27017:27017

That section of the file now looks like this:

services:
  mongo:
    image: mongo
    env_file:
      - .env
    restart: unless-stopped
    volumes:
      - ./whishper_data/db_data:/data/db
      - ./whishper_data/db_data/logs/:/var/log/mongodb/
    environment:
      MONGO_INITDB_ROOT_USERNAME: ${DB_USER:-whishper}
      MONGO_INITDB_ROOT_PASSWORD: ${DB_PASS:-whishper}
    ports:
      - 27017:27017
    command: ['--logpath', '/var/log/mongodb/mongod.log']

(this lets you connect to MongoDB from the host)

I then ran "docker-compose down" and then "docker-compose up" in my terminal (make sure you're in the same directory as the docker-compose.yml). After that I use MongoDB Compass (other software, CLI or GUI, will do this too, but Compass is free) and connect with "mongodb://whishper:whishper@localhost:27017/". From there, as long as the container is running, I can edit the whishper.transcriptions collection by just deleting the entries for the failed processes.
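With the port published as above, the same cleanup can also be done from the host with mongosh instead of Compass. This is a sketch, not an official Whishper procedure: the credentials are the compose-file defaults (whishper/whishper), and the field names shown in the projection are assumptions, so inspect your own documents first.

```shell
# List the documents in whishper.transcriptions so the stuck ones can be
# identified by their _id before deleting anything.
mongosh "mongodb://whishper:whishper@localhost:27017/" --eval '
  db = db.getSiblingDB("whishper");
  // Print every document (projected to _id only here; add fields you care
  // about once you have seen the real schema with findOne()):
  printjson(db.transcriptions.find({}, { _id: 1 }).toArray());
  // Then delete a single stuck entry by its _id, e.g.:
  // db.transcriptions.deleteOne({ _id: ObjectId("<ID_FROM_ABOVE>") });
'
```

Deleting by `_id` is the safest route, since it cannot accidentally remove transcriptions that are still healthy.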

@hmu-duc-anh

Thank you, but it's really complicated for me. This isn't my specialty; I just know very basic programming, and I struggled last night just to install everything and enable the GPU. Perhaps I'll wait for the clear button feature.

@JDeepix

JDeepix commented Feb 23, 2024

Had the same problem, but wasn't able to access the database from outside. Without exposing the port, you can reach the database by entering the MongoDB container:

docker ps -a
(find the container ID of the whishper mongo container)

docker exec -u 0 -it IDFROMABOVE /bin/bash
(now you are inside the container)

mongosh "mongodb://whishper:whishper@localhost:27017/"
use whishper

Then

db.transcriptions.deleteMany({})

clears everything. Don't know how to clear just the translation hiccup, though.
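Rather than wiping the whole collection with deleteMany({}), a narrower filter can target only the broken entries. A sketch under stated assumptions: the status field name and its values are guesses based on the UI states described in this thread (not confirmed against Whishper's schema), so check a real document before running any delete.

```shell
# Inside the mongo container (see the docker exec step above), first look at
# one document to learn the actual field names, then delete with a filter.
mongosh "mongodb://whishper:whishper@localhost:27017/" --eval '
  db = db.getSiblingDB("whishper");
  // Inspect the schema of a single document:
  printjson(db.transcriptions.findOne());
  // If stuck items carry a status such as "pending" (an assumption; verify
  // it against the findOne() output above), remove only those:
  // db.transcriptions.deleteMany({ status: "pending" });
'
```

The same pattern should work for stuck translations if they are stored as a field on the transcription document rather than a separate collection, but that also needs to be confirmed against the findOne() output.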

@Starius65

How can you tell they've failed?

@LegateLaurie
Author

I have no idea what caused the failures; the error logs are obviously not that helpful, and trying to process the same video file again has succeeded for me, so I don't think it's anything to do with the file being transcribed. Maybe just randomness in the model, or the file being corrupted while loading, or any number of issues. I've personally been using a different program, SubtitleEdit, which has essentially greater functionality and is more feature-dense for subtitle editing. It also supports the latest Whisper forks, etc., the same as this.
