
Squeezelite allocates more than 80 MB of reserved memory and gets killed by OOM Killer #165

Open
famigliabormidavigna opened this issue Sep 4, 2022 · 3 comments


famigliabormidavigna commented Sep 4, 2022

When starting on Debian Linux on a very small ARM SBC (a SheevaPlug-based Seagate Dockstar) with only 128 MB of physical memory, Squeezelite allocates up to 80 MB of reserved memory, then the OOM Killer kicks in and kills it.

Squeezelite reported version is

Squeezelite v1.9.9-1403, Copyright 2012-2015 Adrian Smith, 2015-2021 Ralph Irving. See -t for license terms

and dmesg shows this:

[10638.459186][  T954] squeezelite invoked oom-killer: gfp_mask=0x100cca(GFP_HIGHUSER_MOVABLE), order=0, oom_score_adj=0
[10638.470060][  T954] CPU: 0 PID: 954 Comm: squeezelite Not tainted 5.18.6-kirkwood-tld-1 #1.0 480b168af3f03711133a59c80899744b6fc1b46b
[10638.483064][  T954] Hardware name: Marvell Kirkwood (Flattened Device Tree)
[10638.490178][  T954]  unwind_backtrace from show_stack+0x10/0x14
[10638.496247][  T954]  show_stack from dump_header+0x50/0x230
[10638.502053][  T954]  dump_header from oom_kill_process+0x74/0x26c
[10638.508283][  T954]  oom_kill_process from out_of_memory+0x320/0x378
[10638.514777][  T954]  out_of_memory from __alloc_pages+0x918/0xab0
[10638.521156][  T954]  __alloc_pages from handle_mm_fault+0x59c/0x8a0
[10638.527540][  T954]  handle_mm_fault from __get_user_pages+0xf8/0x3a4
[10638.534120][  T954]  __get_user_pages from populate_vma_page_range+0x50/0x64
[10638.541437][  T954]  populate_vma_page_range from __mm_populate+0xf8/0x124
[10638.548447][  T954]  __mm_populate from sys_mlockall+0x98/0xd0
[10638.554411][  T954]  sys_mlockall from ret_fast_syscall+0x0/0x44
[10638.561052][  T954] Exception stack(0x89141fa8 to 0x89141ff0)
[10638.567007][  T954] 1fa0:                   00446a14 7ef33e38 00000003 00000000 7522ed00 00000000
[10638.576144][  T954] 1fc0: 00446a14 7ef33e38 00000000 00000098 00444000 7ef33e20 00448cb0 00000003
[10638.585256][  T954] 1fe0: 751b69f0 7ef3375c 0041e000 751b69fc
[10638.591140][  T954] Mem-Info:
[10638.594189][  T954] active_anon:0 inactive_anon:10 isolated_anon:0
[10638.594189][  T954]  active_file:199 inactive_file:242 isolated_file:0
[10638.594189][  T954]  unevictable:20565 dirty:0 writeback:0
[10638.594189][  T954]  slab_reclaimable:2155 slab_unreclaimable:1542
[10638.594189][  T954]  mapped:15968 shmem:0 pagetables:248 bounce:0
[10638.594189][  T954]  kernel_misc_reclaimable:0
[10638.594189][  T954]  free:765 free_pcp:1 free_cma:0
[10638.634665][  T954] Node 0 active_anon:0kB inactive_anon:40kB active_file:796kB inactive_file:1088kB unevictable:82260kB isolated(anon):0kB isolated(file):0kB mapped:63932kB dirty:0kB writeback:0kB shmem:0kB writeback_tmp:0kB kernel_stack:744kB pagetables:992kB all_unreclaimable? no
[10638.660034][  T954] Normal free:2920kB boost:0kB min:1260kB low:1572kB high:1884kB reserved_highatomic:0KB active_anon:0kB inactive_anon:40kB active_file:796kB inactive_file:1088kB unevictable:82260kB writepending:0kB present:131072kB managed:110484kB mlocked:82260kB bounce:0kB free_pcp:16kB local_pcp:16kB free_cma:0kB
[10638.688647][  T954] lowmem_reserve[]: 0 0
[10638.692875][  T954] Normal: 212*4kB (UME) 115*8kB (UME) 34*16kB (UME) 13*32kB (UE) 3*64kB (UE) 0*128kB 0*256kB 0*512kB 0*1024kB 0*2048kB 0*4096kB = 2920kB
[10638.707015][  T954] 16222 total pagecache pages
[10638.711812][  T954] 7 pages in swap cache
[10638.715931][  T954] Swap cache stats: add 27769, delete 27762, find 4620/16647
[10638.723287][  T954] Free swap  = 2056772kB
[10638.727891][  T954] Total swap = 2068476kB
[10638.732218][  T954] 32768 pages RAM
[10638.735849][  T954] 0 pages HighMem/MovableOnly
[10638.740578][  T954] 5147 pages reserved
[10638.744498][  T954] Tasks state (memory values in pages):
[10638.749986][  T954] [  pid  ]   uid  tgid total_vm      rss pgtables_bytes swapents oom_score_adj name
[10638.759745][  T954] [    217]     0   217    11439      265    32768      152          -250 systemd-journal
[10638.769648][  T954] [    227]     0   227     1079       71    12288       35             0 blkmapd
[10638.779179][  T954] [    237]     0   237     4760      128    16384      150         -1000 systemd-udevd
[10638.788983][  T954] [    300]     0   300      753       75    10240       42             0 rpc.idmapd
[10638.798479][  T954] [    312]   111   312     1893       94    14336       87             0 rpcbind
[10638.808005][  T954] [    314]     0   314     8396      110    22528      231             0 dhclient
[10638.817341][  T954] [    322]   103   322     1723      152    14336       76             0 avahi-daemon
[10638.827034][  T954] [    323]   102   323     1928       86    14336       93          -900 dbus-daemon
[10638.837860][  T954] [    324]   103   324     1684       46    14336       62             0 avahi-daemon
[10638.847544][  T954] [    326]     0   326     5294      111    22528      131             0 systemd-logind
[10638.858313][  T954] [    397]     0   397     1228       75    12288       94             0 rpc.mountd
[10638.867806][  T954] [    410]     0   410     3116       96    18432      167         -1000 sshd
[10638.876852][  T954] [    427]     0   427     1103       76    12288       23             0 agetty
[10638.885970][  T954] [    428]     0   428      722       81    10240       19             0 syslogd
[10638.895189][  T954] [    431]     0   431     1645       70    14336       23             0 agetty
[10638.904450][  T954] [    701]     0   701     3348      302    20480      212             0 sshd
[10638.913394][  T954] [    707]     0   707     2019      124    14336       84             0 bash
[10638.922321][  T954] [    875]     0   875     2132      134    14336       67             0 tmux: client
[10638.932103][  T954] [    877]     0   877     2315      330    16384      247             0 tmux: server
[10638.941747][  T954] [    878]     0   878     2071      234    16384       91             0 bash
[10638.950684][  T954] [    894]     0   894     2012      128    14336       84             0 bash
[10638.960104][  T954] [    910]     0   910     2224      288    18432      171             0 htop
[10638.969196][  T954] [    954]     0   954    34291    20565   122880        0             0 squeezelite
[10638.978873][  T954] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),task=squeezelite,pid=954,uid=0
[10638.988837][  T954] Out of memory: Killed process 954 (squeezelite) total-vm:137164kB, anon-rss:19296kB, file-rss:62964kB, shmem-rss:0kB, UID:0 pgtables:120kB oom_score_adj:0

Please note that under Arch Linux running on the same SBC, Squeezelite v1.9.8-1344 allocates less than 9 MB of reserved memory.

What made the memory footprint become so huge? Compile options are the same for both versions.

@famigliabormidavigna famigliabormidavigna changed the title Squeezelite gets allocate more than 80 MB of reserved memory and gets killed by OOM Killer Squeezelite allocates more than 80 MB of reserved memory and gets killed by OOM Killer Sep 4, 2022

tmancill commented Sep 5, 2022

Hi @famigliabormidavigna . I maintain the Debian packaging of squeezelite and am curious to understand what is happening here. Note from your output above, you are compiling the software yourself and not using the Debian squeezelite package, for which the latest version is v1.9.9-1395.

First a question. Why are you building for Debian but testing a different version of squeezelite on Arch Linux? Is the memory usage equivalent for squeezelite v1.9.9-1403 on Arch Linux?

Assuming that the difference in memory usage is due to the OS distribution and not related to the version of squeezelite, the difference could be in the libraries squeezelite is linked against. You can start by looking at the version of glibc and checking whether you are using an alternative malloc implementation, for example jemalloc on Arch. In order to investigate further, please provide:

  1. the output of `ldd -v /usr/bin/squeezelite` for builds on both operating systems
  2. the versions of both operating systems you are using
  3. your build options for the Debian version so they can be compared to those used for Arch Linux
  4. the output of `pmap -x $squeezelitepid` on both operating systems

I am assuming all other variables are the same - that is, you are using squeezelite to stream from the same source on both operating systems and running for the same duration, that the kernel version is the same (or very similar), and that the OOM killer is configured identically for both distributions. But those could also be factors.

famigliabormidavigna (Author) commented

Hi tmancill

Hi @famigliabormidavigna . I maintain the Debian packaging of squeezelite and am curious to understand what is happening here. Note from your output above, you are compiling the software yourself and not using the Debian squeezelite package, for which the latest version is v1.9.9-1395.

Great! I had a feeling I was posting in the right place.
Please note that in the Debian repo I am using, the latest squeezelite version is 1.9+git20210102.78fef68-3, and this is my /etc/apt/sources.list

deb     http://ftp.us.debian.org/debian bullseye main 
deb-src http://ftp.us.debian.org/debian bullseye main
# deb http://security.debian.org/ bullseye/updates main contrib non-free 
# deb   http://security.debian.org/debian-security bullseye-security main contrib non-free
deb     https://security.debian.org/debian-security bullseye-security main contrib non-free
# deb-src       http://security.debian.org/ bullseye/updates main contrib non-free 
deb     http://http.debian.net/debian bullseye-updates main contrib 
deb-src http://http.debian.net/debian bullseye-updates main contrib      

First a question. Why are you building for Debian but testing a different version on squeezelite for Arch Linux? Is the memory usage equivalent for squeezelite v1.9.9-1403 on Arch Linux?

Long story...
I am using, as some other hobbyists out there I suppose, a single board computer based on the ARM Kirkwood SoC, specifically a Seagate Dockstar, which was itself based on the SheevaPlug. It is an armv5tel architecture. This was supported by Arch Linux ARM https://archlinuxarm.org/ until late February, when the developers decided to pull the plug since the platform was residual (and had been out of production for a few years).
I then decided not to dump all the hardware I have lying around my house and some friends' places, and switched to the Linux kernel Kirkwood package and Debian rootfs (bullseye) provided by Doozan https://forum.doozan.com/read.php?2,12096
On Arch Linux ARM the latest squeezelite version available (to me at least) was v1.9.8-1344, and its memory usage was far lower and different. I cannot install later squeezelite versions from Arch Linux ARM anymore since they removed support and their repo is no longer available for armv5tel. I cannot even try to compile the latest version myself since I might not have all the dependencies.

Assuming that the difference in memory usage is due to the OS distribution and not related to version of squeezelite, the difference could be in the libraries squeezelite is linked against. You can start by looking the version of glibc and whether you are using an alternative malloc implementation, for example, jemalloc on Arch. In order to investigate further, please provide:

1. the output of `ldd -v /usr/bin/squeezelite` for builds on both operating systems

2. the versions of both operating systems you are using

3. your build options for the Debian version so they can be compared to those used for Arch Linux

4. the output of `pmap -x $squeezelitepid` on both operating systems
  1. please find the ldd output attached to this post, named clearly I hope:
     squeezelite-alarm-ldd.txt
     squeezelite-debian-ldd.txt
  2. `uname -a` reports `Linux squeezedock 4.4.271-1-ARCH #1 PREEMPT Sun Jun 6 01:22:10 UTC 2021 armv5tel GNU/Linux` for the host with Arch Linux ARM and `Linux debian 5.18.6-kirkwood-tld-1 #1.0 PREEMPT Fri Jun 24 15:26:02 PDT 2022 armv5tel GNU/Linux` for the host with Doozan's Debian bullseye: is this enough? I know the kernels are a universe apart, but I cannot help more than this since Arch Linux ARM no longer supports this board...
  3. On Debian I just modified this line of debian/rules from `export OPTS := -DDSD -DFFMPEG -DRESAMPLE -DVISEXPORT -DLINKALL -DIR -DUSE_SSL` to `export OPTS := -DDSD -DFFMPEG -DRESAMPLE -DVISEXPORT -DLINKALL` to remove SSL and IR, thinking those might be the cause of the huge memory footprint.
  4. please find the pmap output attached to this post. Please also note that I managed to capture the pmap output on Debian in the few seconds between starting squeezelite and the OOM Killer kicking in; squeezelite never runs for more than a few seconds, 5 maybe, no more than 10.
     squeezelite-alarm-pmap.txt
     squeezelite-debian-pmap.txt

I am assuming all other variables are the same - that is, you are using squeezelite to stream from the same source on both operating systems and running for the same duration, that the kernel version is the same (or very similar), and that the OOM killer is configured identically for both distributions. But those could also be factors.

Yes, the LMS server is the same, as you can see from the command line included in the pmap output. Regarding running duration, on Debian squeezelite gets killed a few seconds after it has been started, whether by systemd or from the command line. Kernel versions are unfortunately vastly different (Arch Linux ARM runs 4.4.271 while Debian bullseye runs 5.18.6). The OOM Killer is configured at its defaults on both distros; I have no idea whether the configuration is actually the same.

Thanks for your support!
I hope this helps make squeezelite on Debian better.

Paolo.

P.S. I also installed squeezelite on an i386 box, running Debian bullseye and using stock squeezelite from the default repository: the pattern of virtual and reserved memory reported in htop is quite similar to the one for Debian on ARM (80 MB reserved memory). Since the box has 1 GB RAM and little else running, everything runs fine, including squeezelite.


iakovos-panourgias commented Mar 18, 2023

Hi, just to mention that I've encountered the same issue (OOM killing squeezelite) on a Raspberry Pi Model B Rev 1 running Raspbian v11.6:

root@XXXXXXXXXXX:~# uname -a
Linux XXXXXXXXXXX 6.1.19+ #1637 Tue Mar 14 11:01:56 GMT 2023 armv6l GNU/Linux
root@XXXXXXXXXXX:~# lsb_release --all
No LSB modules are available.
Distributor ID:	Raspbian
Description:	Raspbian GNU/Linux 11 (bullseye)
Release:	11
Codename:	bullseye
root@XXXXXXXXXXX:~# cat /etc/debian_version 
11.6

The version that triggers the OOM is the default one I installed using apt: Squeezelite v1.9.8-1317, Copyright 2012-2015 Adrian Smith, 2015-2021 Ralph Irving. See -t for license terms.

When I run this version I get this in the command line:

root@XXXXXXXXXXX:~# /usr/bin/squeezelite -n XXXXXXXXXXX -a ":::0" -s "192.168.0.102" -o hw:CARD=Headphones,DEV=0 -d all=debug
[22:27:20.710482] stream_init:448 init stream
[22:27:20.712887] stream_init:449 streambuf size: 2097152
[22:27:20.772480] output_init_alsa:940 init output
[22:27:20.774560] output_init_alsa:979 requested alsa_buffer: 40 alsa_period: 4 format: any mmap: 0
[22:27:20.781515] output_init_common:353 outputbuf size: 3528000
[22:27:20.784059] output_init_common:377 idle timeout: 0
[22:27:20.827816] output_init_common:425 supported rates: 192000 176400 96000 88200 48000 44100 32000 24000 22500 16000 12000 11025 8000 
Killed

And this OOM error in dmesg:

[Sat Mar 18 22:27:26 2023] squeezelite invoked oom-killer: gfp_mask=0x400dc0(GFP_KERNEL_ACCOUNT|__GFP_ZERO), order=0, oom_score_adj=0
[Sat Mar 18 22:27:26 2023] CPU: 0 PID: 1306 Comm: squeezelite Tainted: G         C         6.1.19+ #1637
[Sat Mar 18 22:27:26 2023] Hardware name: BCM2835
[Sat Mar 18 22:27:26 2023]  unwind_backtrace from show_stack+0x18/0x1c
[Sat Mar 18 22:27:26 2023]  show_stack from dump_stack_lvl+0x34/0x58
[Sat Mar 18 22:27:26 2023]  dump_stack_lvl from dump_header+0x48/0x1d8
[Sat Mar 18 22:27:26 2023]  dump_header from oom_kill_process+0x1d0/0x1e8
[Sat Mar 18 22:27:26 2023]  oom_kill_process from out_of_memory+0x258/0x354
[Sat Mar 18 22:27:26 2023]  out_of_memory from __alloc_pages+0xa88/0xd94
[Sat Mar 18 22:27:26 2023]  __alloc_pages from handle_mm_fault+0xc54/0xd94
[Sat Mar 18 22:27:26 2023]  handle_mm_fault from __get_user_pages+0x1d8/0x558
[Sat Mar 18 22:27:26 2023]  __get_user_pages from populate_vma_page_range+0x5c/0x70
[Sat Mar 18 22:27:26 2023]  populate_vma_page_range from __mm_populate+0xa0/0x1c8
[Sat Mar 18 22:27:26 2023]  __mm_populate from sys_mlockall+0x148/0x160
[Sat Mar 18 22:27:26 2023]  sys_mlockall from ret_fast_syscall+0x0/0x1c
[Sat Mar 18 22:27:26 2023] Exception stack(0xcfa6dfa8 to 0xcfa6dff0)
[Sat Mar 18 22:27:26 2023] dfa0:                   000435d4 0002d68c 00000003 00000000 00000000 00000000
[Sat Mar 18 22:27:26 2023] dfc0: 000435d4 0002d68c 00045884 00000098 bed5b88f bed5b304 00000000 00000003
[Sat Mar 18 22:27:26 2023] dfe0: 00040d24 bed5b1cc 0001dc0c b4f979dc
[Sat Mar 18 22:27:26 2023] Mem-Info:
[Sat Mar 18 22:27:26 2023] active_anon:4518 inactive_anon:439 isolated_anon:0
                            active_file:79 inactive_file:0 isolated_file:0
                            unevictable:29487 dirty:0 writeback:0
                            slab_reclaimable:2129 slab_unreclaimable:1904
                            mapped:22273 shmem:202 pagetables:367
                            sec_pagetables:0 bounce:0
                            kernel_misc_reclaimable:0
                            free:15936 free_pcp:0 free_cma:11944
[Sat Mar 18 22:27:26 2023] Node 0 active_anon:18072kB inactive_anon:1756kB active_file:316kB inactive_file:0kB unevictable:117948kB isolated(anon):0kB isolated(file):0kB mapped:89092kB dirty:0kB writeback:0kB shmem:808kB writeback_tmp:0kB kernel_stack:736kB pagetables:1468kB sec_pagetables:0kB all_unreclaimable? no
[Sat Mar 18 22:27:26 2023] Normal free:63744kB boost:0kB min:16384kB low:20480kB high:24576kB reserved_highatomic:0KB active_anon:18072kB inactive_anon:1756kB active_file:316kB inactive_file:0kB unevictable:117948kB writepending:0kB present:245760kB managed:229052kB mlocked:117948kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:47776kB
[Sat Mar 18 22:27:26 2023] lowmem_reserve[]: 0 0
[Sat Mar 18 22:27:26 2023] Normal: 226*4kB (UMEC) 213*8kB (UEC) 75*16kB (UMEC) 47*32kB (UEC) 27*64kB (UME) 15*128kB (UME) 6*256kB (UE) 2*512kB (UC) 1*1024kB (U) 3*2048kB (UEC) 11*4096kB (C) = 63744kB
[Sat Mar 18 22:27:26 2023] 22553 total pagecache pages
[Sat Mar 18 22:27:26 2023] 0 pages in swap cache
[Sat Mar 18 22:27:26 2023] Free swap  = 0kB
[Sat Mar 18 22:27:26 2023] Total swap = 0kB
[Sat Mar 18 22:27:26 2023] 61440 pages RAM
[Sat Mar 18 22:27:26 2023] 0 pages HighMem/MovableOnly
[Sat Mar 18 22:27:26 2023] 4177 pages reserved
[Sat Mar 18 22:27:26 2023] 16384 pages cma reserved
[Sat Mar 18 22:27:26 2023] Tasks state (memory values in pages):
[Sat Mar 18 22:27:26 2023] [  pid  ]   uid  tgid total_vm      rss pgtables_bytes swapents oom_score_adj name
[Sat Mar 18 22:27:26 2023] [    108]     0   108    11424      717    30720        0          -250 systemd-journal
[Sat Mar 18 22:27:26 2023] [    129]     0   129     4984      801    16384        0         -1000 systemd-udevd
[Sat Mar 18 22:27:26 2023] [    211]   103   211     5569      642    22528        0             0 systemd-timesyn
[Sat Mar 18 22:27:26 2023] [    236]   108   236     1724      502    14336        0             0 avahi-daemon
[Sat Mar 18 22:27:26 2023] [    237]     0   237     2046      372    14336        0             0 cron
[Sat Mar 18 22:27:26 2023] [    238]   104   238     1958      532    16384        0          -900 dbus-daemon
[Sat Mar 18 22:27:26 2023] [    239]   108   239     1685      237    14336        0             0 avahi-daemon
[Sat Mar 18 22:27:26 2023] [    243]     0   243     9628      867    24576        0             0 polkitd
[Sat Mar 18 22:27:26 2023] [    252]     0   252     6442      608    22528        0             0 rsyslogd
[Sat Mar 18 22:27:26 2023] [    263]     0   263     3286      731    22528        0             0 systemd-logind
[Sat Mar 18 22:27:26 2023] [    265] 65534   265     1324      425    12288        0             0 thd
[Sat Mar 18 22:27:26 2023] [    271]     0   271     2944      631    18432        0             0 wpa_supplicant
[Sat Mar 18 22:27:26 2023] [    284]     0   284     6920      336    16384        0             0 rngd
[Sat Mar 18 22:27:26 2023] [    290]     0   290    13933     1036    36864        0             0 ModemManager
[Sat Mar 18 22:27:26 2023] [    312]     0   312     3099      967    18432        0         -1000 sshd
[Sat Mar 18 22:27:26 2023] [    342]     0   342     3011      515    18432        0             0 wpa_supplicant
[Sat Mar 18 22:27:26 2023] [    391]     0   391     3625     1075    20480        0             0 sshd
[Sat Mar 18 22:27:26 2023] [    429]     0   429      715      373    10240        0             0 dhcpcd
[Sat Mar 18 22:27:26 2023] [    439]     0   439     1117      318    12288        0             0 agetty
[Sat Mar 18 22:27:26 2023] [    441]     0   441     1659      311    14336        0             0 agetty
[Sat Mar 18 22:27:26 2023] [    447]     0   447     3625     1083    20480        0             0 sshd
[Sat Mar 18 22:27:26 2023] [    450]  1000   450     3590      706    22528        0             0 systemd
[Sat Mar 18 22:27:26 2023] [    451]  1000   451     9002      651    30720        0             0 (sd-pam)
[Sat Mar 18 22:27:26 2023] [    473]  1000   473     3625      798    20480        0             0 sshd
[Sat Mar 18 22:27:26 2023] [    474]  1000   474     3625      773    20480        0             0 sshd
[Sat Mar 18 22:27:26 2023] [    475]  1000   475     2112      515    14336        0             0 bash
[Sat Mar 18 22:27:26 2023] [    476]  1000   476     2112      519    14336        0             0 bash
[Sat Mar 18 22:27:26 2023] [    504]  1000   504     3251      498    20480        0             0 sudo
[Sat Mar 18 22:27:26 2023] [    505]     0   505     3083      485    18432        0             0 su
[Sat Mar 18 22:27:26 2023] [    506]     0   506     2005      404    14336        0             0 bash
[Sat Mar 18 22:27:26 2023] [   1305]  1000  1305     1684      316    14336        0             0 dmesg
[Sat Mar 18 22:27:26 2023] [   1306]     0  1306    35737    29483   137216        0             0 squeezelite
[Sat Mar 18 22:27:26 2023] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),global_oom,task_memcg=/,task=squeezelite,pid=1306,uid=0
[Sat Mar 18 22:27:26 2023] Out of memory: Killed process 1306 (squeezelite) total-vm:142948kB, anon-rss:28844kB, file-rss:89088kB, shmem-rss:0kB, UID:0 pgtables:134kB oom_score_adj:0

Using squeezelite from SourceForge (squeezelite-1.9.9.1419-ffmpeg-armhf.tar.gz), I get a running squeezelite that is able to connect to my LMS. The downloaded version reports Squeezelite v1.9.9-1419, Copyright 2012-2015 Adrian Smith, 2015-2021 Ralph Irving. See -t for license terms, and this is the output:

root@XXXXXXXXXXX:~# ./squeezelite -n XXXXXXXXXXX -a ":::0" -s "192.168.0.102" -o hw:CARD=Headphones,DEV=0 -d all=debug
[22:33:34.353777] stream_init:466 init stream
[22:33:34.357708] stream_init:467 streambuf size: 2097152
[22:33:34.388837] output_init_alsa:936 init output
[22:33:34.390999] output_init_alsa:976 requested alsa_buffer: 40 alsa_period: 4 format: any mmap: 0
[22:33:34.393028] output_init_common:360 outputbuf size: 3528000
[22:33:34.394826] output_init_common:384 idle timeout: 0
[22:33:34.420680] test_open:301 sample rate 1536000 not supported
[22:33:34.422779] test_open:301 sample rate 1411200 not supported
[22:33:34.424777] test_open:301 sample rate 768000 not supported
[22:33:34.426160] test_open:301 sample rate 705600 not supported
[22:33:34.427401] test_open:301 sample rate 384000 not supported
[22:33:34.427836] test_open:301 sample rate 352800 not supported
[22:33:34.430470] output_init_common:426 supported rates: 192000 176400 96000 88200 48000 44100 32000 24000 22500 16000 12000 11025 8000 
[22:33:34.454144] output_init_alsa:1002 memory locked
[22:33:34.456402] output_init_alsa:1008 glibc detected using mallopt
[22:33:34.460463] output_thread:685 open output device: hw:CARD=Headphones,DEV=0
[22:33:34.461406] alsa_open:354 opening device at: 44100
[22:33:34.463931] alsa_open:425 opened device hw:CARD=Headphones,DEV=0 using format: S16_LE sample rate: 44100 mmap: 0
[22:33:34.464448] alsa_open:516 buffer: 40 period: 4 -> buffer size: 1776 period size: 444
[22:33:34.467340] output_init_alsa:1028 set output sched fifo rt: 45
[22:33:34.468727] decode_init:153 init decode
[22:33:34.470135] register_dsd:908 using dsd to decode dsf,dff
[22:33:34.471815] register_ff:779 using ffmpeg to decode alc
[22:33:34.472161] register_ff:763 using ffmpeg to decode wma,wmap,wmal
[22:33:34.474094] register_faad:663 using faad to decode aac
[22:33:34.474437] register_vorbis:387 using vorbis to decode ogg
[22:33:34.475801] register_opus:332 using opus to decode ops
[22:33:34.476372] register_flac:341 using flac to decode ogf,flc
[22:33:34.478643] register_pcm:483 using pcm to decode aif,pcm
[22:33:34.479046] register_mad:423 using mad to decode mp3
[22:33:34.479308] decode_init:202 include codecs:  exclude codecs: 
[22:33:34.481450] slimproto:898 connecting to 192.168.0.102:3483
[22:33:34.489119] slimproto:937 connected
[22:33:34.491204] sendHELO:148 mac: 80:1f:02:d6:16:b4
[22:33:34.492531] sendHELO:150 cap: CanHTTPS=1,Model=squeezelite,AccuratePlayPoints=1,HasDigitalOut=1,HasPolarityInversion=1,Balance=1,Firmware=v1.9.9-1419,ModelName=SqueezeLite,MaxSampleRate=192000,dsf,dff,alc,wma,wmap,wmal,aac,ogg,ops,ogf,flc,aif,pcm,mp3
[22:33:34.503695] process:528 setd

Thanks,
Iakovos
