09-16-2008, 01:05 AM
I am running srcds in a FreeBSD 7 jail, so there is no /var/run/dmesg.boot for srcds_run to read when it picks the correct binary. I got around this by copying dmesg.boot from the host system into the srcds directory.
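The copy itself was nothing fancy; roughly this, run from the host (the jail root here is a made-up path, adjust it to wherever your jail actually lives):
Code:
# run on the host, not inside the jail; /usr/jails/srcds1 is a placeholder path
cp /var/run/dmesg.boot /usr/jails/srcds1/home/deus/srcds/dmesg.boot
I then edited the path to it in the srcds_run script like so: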
Original:
Code:
elif test "FreeBSD" = `uname`; then
    CPU="`grep 'CPU:' /var/run/dmesg.boot`"
    FEATURES="`grep 'Features=' /var/run/dmesg.boot`"
    AMD="`echo $CPU | grep AMD`"
    I686="`echo $CPU | grep 686`"
    SSE2="`echo $FEATURES | grep -i SSE2`"
    if test -n "$AMD"; then
        echo "Using AMD Optimised binary."
        HL=./srcds_amd
    elif test -n "$SSE2"; then
        echo "Using SSE2 Optimised binary."
        HL=./srcds_i686
    else
        echo "Using default binary."
    fi
else
    echo "Using default binary."
fi
Modified:
Code:
elif test "FreeBSD" = `uname`; then
    # changed: read the dmesg.boot copied in from the host instead of /var/run
    CPU="`grep 'CPU:' ./dmesg.boot`"
    FEATURES="`grep 'Features=' ./dmesg.boot`"
    AMD="`echo $CPU | grep AMD`"
    I686="`echo $CPU | grep 686`"
    SSE2="`echo $FEATURES | grep -i SSE2`"
    if test -n "$AMD"; then
        echo "Using AMD Optimised binary."
        HL=./srcds_amd
    elif test -n "$SSE2"; then
        echo "Using SSE2 Optimised binary."
        HL=./srcds_i686
    else
        echo "Using default binary."
    fi
else
    echo "Using default binary."
fi
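Thinking about it, the dmesg.boot copy could probably be skipped entirely: sysctl still works inside a jail, and hw.model carries enough to pick a binary. A rough sketch, untested; matching 'Core(TM)2' is an assumption that only holds because every Core 2 has SSE2:
Code:
elif test "FreeBSD" = `uname`; then
    # sysctl is readable inside a jail even though /var/run/dmesg.boot is not
    CPU="`sysctl -n hw.model`"
    AMD="`echo $CPU | grep AMD`"
    SSE2="`echo $CPU | grep -i 'Core(TM)2'`"
    if test -n "$AMD"; then
        echo "Using AMD Optimised binary."
        HL=./srcds_amd
    elif test -n "$SSE2"; then
        echo "Using SSE2 Optimised binary."
        HL=./srcds_i686
    else
        echo "Using default binary."
    fi
else
    echo "Using default binary."
fi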
Either way, that makes it use the i686 binary, but it seems to take forever to load. The i486 binary loads very quickly, so I just let the script fall back to the default binary and use that instead.
For the record, it's an Intel Core 2 Duo 2.0 GHz CPU.
The game starts fine and I can play multiple rounds on the server, but when I try to quit, it segfaults and leaves a core dump, yet somehow leaves the process running.
System:
Code:
%uname -a
FreeBSD srcds1.mynoc.org 7.0-RELEASE FreeBSD 7.0-RELEASE #0: Sun Feb 24 10:35:36 UTC 2008 root@driscoll.cse.buffalo.edu:/usr/obj/usr/src/sys/GENERIC amd64
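One thing that matters for the crash below: srcds ships as a 32-bit Linux binary, so on FreeBSD it runs through the Linux compatibility layer. A quick sanity check that the module is loaded (linux.ko is the standard module name on stock FreeBSD 7):
Code:
%kldstat | grep linux
If that prints nothing, kldload linux would be the first thing to try.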
Start command:
Code:
%sh ./srcds_run -maxplayers 24 -secure -game cstrike -tickrate 100 +map gg_aim_shotty
Quitting:
Code:
Connection to Steam servers successful.
VAC secure mode is activated.
quit
delete entity thread
Segmentation fault (core dumped)
Add "-debug" to the ./srcds_run command line to generate a debug.log to help with solving this problem
Mon Sep 15 14:30:56 UTC 2008: Server restart in 10 seconds
Mon Sep 15 14:30:58 UTC 2008: Server Quit
I then hit Ctrl-C to exit the restart loop, and it drops me back to the shell, but then I have this:
Code:
%ps aux | grep srcds
deus 4923 0.0 15.7 325016 324792 p2 SJ 2:30PM 0:00.00 ./srcds_i486 -maxplayers 24 -secure -game cstrike -tickrate 10
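The J in that SJ state just means the process is jailed; it's otherwise an ordinary sleeping process, and the only way I have to get rid of it is killing it by hand (PID taken from the ps output above):
Code:
# SIGTERM first; SIGKILL only if it is still there after a few seconds
%kill 4923
%kill -9 4923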
If I wait a while before killing the process, I start getting these messages on my tty:
Code:
ES_TOOLS : DEBUG NOTICE : Deadlock detected, killing server process for an auto restart...
ES_TOOLS : DEBUG NOTICE : Deadlock detected, killing server process for an auto restart...
ES_TOOLS : DEBUG NOTICE : Deadlock detected, killing server process for an auto restart...
When I run:
Code:
%./srcds_run -maxplayers 24 -secure -game cstrike -tickrate 100 +map gg_aim_shotty -debug
I get this output when I quit:
Code:
Connection to Steam servers successful.
VAC secure mode is activated.
quit
delete entity thread
Segmentation fault (core dumped)
cat: hlds.5021.pid: No such file or directory
warning: A handler for the OS ABI "GNU/Linux" is not built into this configuration
of GDB. Attempting to continue with the default i386 settings.
email debug.log to linux@valvesoftware.com
Mon Sep 15 14:43:16 UTC 2008: Server restart in 10 seconds
Mon Sep 15 14:43:19 UTC 2008: Server Quit
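That osabi warning is the automatic gdb pass saying it can't interpret a GNU/Linux core with its default i386 settings, which is presumably why the backtrace in debug.log is all question marks. With a gdb build that does include the GNU/Linux handler (a ports gdb, for instance; the base-system one here clearly doesn't), the core could be opened by hand along these lines (the core file name is FreeBSD's default progname.core, adjust to whatever actually landed on disk):
Code:
%gdb ./srcds_i486 srcds_i486.core
(gdb) set osabi GNU/Linux
(gdb) bt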
That leaves me with a 331 MB core dump and a debug.log, which I have attached, along with the es_tools messages and the still-running process.
For quick reference, I think this is the pertinent info from debug.log:
Code:
----------------------------------------------
CRASH: Mon Sep 15 14:42:52 UTC 2008
Start Line: ./srcds_i486 -maxplayers 24 -secure -game cstrike -tickrate 100 +map gg_aim_shotty -debug
Core was generated by `srcds_i486'.
Program terminated with signal 11, Segmentation fault.
#0 0x35951f53 in ?? ()
#0 0x35951f53 in ?? ()
#1 0x35a6ada0 in ?? ()
#2 0x08eb94b8 in ?? ()
<snip>
#446 0x08052ee6 in _IO_stdin_used ()
#447 0x08049085 in main ()
No symbol table info available.
No shared libraries loaded at this time.
Stack level 0, frame at 0xffffb464:
eip = 0x35951f53; saved eip 0x35a6ada0
called by frame at 0xffffb468
Arglist at 0xffffb45c, args:
Locals at 0xffffb45c, Previous frame's sp is 0xffffb464
Saved registers:
eip at 0xffffb460
End of Source crash report
----------------------------------------------
So, uh, I guess my question is: what gives?