
Caught panic with message: called Result::unwrap() on an Err value: Error { repr: Os { code: 98, message: "Address already in use" } } #126

Closed
Scavendoa2 opened this issue Apr 14, 2018 · 3 comments
Labels
waiting for feedback Issues that need user feedback for e.g. log files

Comments

@Scavendoa2

Hello everyone,

I'm having some kind of issue, but I don't know how to fix it.

I've bought a HiFiBerry DAC+ (standard) and I'm running LibreELEC on it.

Spotifyd won't run, and I don't know what I'm missing or what is wrong.

Here are my devices:

LibreELEC:~/downloads # aplay -L
null
Discard all samples (playback) or generate zero samples (capture)
default:CARD=sndrpihifiberry
snd_rpi_hifiberry_dacplus,
Default Audio Device
sysdefault:CARD=sndrpihifiberry
snd_rpi_hifiberry_dacplus,
Default Audio Device

LibreELEC:~/downloads # aplay -l
**** List of PLAYBACK Hardware Devices ****
card 0: sndrpihifiberry [snd_rpi_hifiberry_dacplus], device 0: HiFiBerry DAC+ HiFi pcm512x-hifi-0 []
Subdevices: 1/1
Subdevice #0: subdevice #0

Here is my config file:

[global]
username = xxxxxxx
password = xxxxxxx
backend = sndrpihifiberry
device = snd_rpi_hifiberry_dacplus # Given by aplay -L
mixer = PCM
volume-control = softvol # or alsa_linear, or softvol
#onevent = command_run_on_playback_event
device_name = RaspBerrySpotifydDaemon
bitrate = 96|160|320
cache_path = cache_directory
volume-normalisation = true
normalisation-pregain = -10

I'm not sure about the backend, device, and volume-control values.
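Not an authoritative fix, but for comparison, here is a sketch of what those fields usually look like with spotifyd's ALSA backend. The device string is taken from the `aplay -L` output above; the `cache_path` value and the exact option names are assumptions and should be checked against the spotifyd README for your version. Note that `backend` names the audio backend (e.g. `alsa`), not the sound card, and that `bitrate` and `volume-control` each take a single value, not a `|`-separated list; inline `#` comments on value lines may also be read as part of the value by some parser versions, so comments are safest on their own lines:

```ini
[global]
username = xxxxxxx
password = xxxxxxx
# backend is spotifyd's audio backend, not the card name
backend = alsa
# an ALSA PCM name taken from the `aplay -L` output
device = default:CARD=sndrpihifiberry
mixer = PCM
# pick one of: alsa, alsa_linear, softvol
volume-control = softvol
device_name = RaspBerrySpotifydDaemon
# pick one of 96, 160, 320 -- not the literal string "96|160|320"
bitrate = 160
# hypothetical path; any writable directory works
cache_path = /var/cache/spotifyd
volume-normalisation = true
normalisation-pregain = -10
```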

Here's the log from running the following command:

./spotifyd -c ./spotifyd.conf -v --no-daemon

[TRACE] mio::poll: [:785] registering with poller
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:523] build; num-workers=4
13:39:20 [DEBUG] tokio_reactor::background: starting background reactor
13:39:20 [INFO] Using software volume controller.
13:39:20 [DEBUG] librespot_connect::discovery: Zeroconf server listening on 0.0.0.0:0
13:39:20 [TRACE] mio::poll: [:785] registering with poller
13:39:20 [ERROR] Caught panic with message: called Result::unwrap() on an Err value: Error { repr: Os { code: 98, message: "Address already in use" } }
13:39:20 [TRACE] tokio_reactor: [:330] event Readable Token(0)
13:39:20 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
13:39:20 [DEBUG] tokio_reactor::background: shutting background reactor down NOW
13:39:20 [DEBUG] tokio_reactor::background: background reactor has shutdown
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:871] shutdown; state=State { lifecycle: 0, num_futures: 0 }
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:917] -> transitioned to shutdown
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:929] -> shutting down workers
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933] -> shutdown worker; idx=3; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=3
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=3
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933] -> shutdown worker; idx=2; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=2
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=2
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933] -> shutdown worker; idx=1; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=1
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=1
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933] -> shutdown worker; idx=0; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=0
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=0
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:986] notifying shutdown task
13:39:20 [TRACE] tokio_threadpool: [/root/.cargo/registry/src/github.jparrowsec.cn-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:851] Shutdown::poll
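For what it's worth, the panic in the log is an unhandled socket-bind error: some other process already holds an address that spotifyd (via librespot's Zeroconf discovery) tries to bind, and the `unwrap()` on that result turns os error 98 into the panic message. A minimal Rust sketch reproducing the same failure mode (the specific port spotifyd uses is not shown in the log, so this just binds an arbitrary free port twice):

```rust
use std::io::ErrorKind;
use std::net::TcpListener;

// Bind twice to the same address and return the second attempt's error kind.
fn double_bind_error() -> ErrorKind {
    // The first listener takes a free port assigned by the OS.
    let first = TcpListener::bind("127.0.0.1:0").expect("first bind failed");
    let addr = first.local_addr().unwrap();
    // The second bind fails with "Address already in use" (os error 98 on
    // Linux); calling .unwrap() on it is exactly the panic shown in the log.
    TcpListener::bind(addr).unwrap_err().kind()
}

fn main() {
    assert_eq!(double_bind_error(), ErrorKind::AddrInUse);
    println!("second bind failed with AddrInUse");
}
```

So the question is which process already owns the port, not anything about the ALSA config.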

@Scavendoa2 Scavendoa2 changed the title [ERROR] Caught panic with message: called Result::unwrap() Caught panic with message: called Result::unwrap() on an Err value: Error { repr: Os { code: 98, message: "Address already in use" } } Apr 14, 2018
@mainrs
Member

mainrs commented Sep 10, 2019

You've probably figured it out by now or just abandoned Spotifyd. In case you still want to help out, take a look at #247. I think it's the same issue, and I posted some commands there that can help identify which program is listening on the ports used by Librespot.

@mainrs mainrs added the waiting for feedback Issues that need user feedback for e.g. log files label Sep 10, 2019
@mainrs
Member

mainrs commented Oct 3, 2019

I am closing the issue for now. Feel free to re-open it if you need help.

@mainrs mainrs closed this as completed Oct 3, 2019
@Scavendoa2
Author

> You probably figured it out by now or just abandoned Spotifyd. In the case you still want to help out, take a look at #247. I think it's the same issue and I posted some commands that could help identify what program is listening to the ports used by Librespot.

You're right. I've abandoned Spotifyd and LibreELEC, and I've migrated to Volumio.

2 participants