SM64 IRIX - GL Linking error

panther5482

New member
Apr 5, 2019
11
0
1
NOTE: I initially posted this to a different forum, I figured I'd copy the original post but there are some addendums at the bottom.

Recently, the SM64 PC port announced back in May has received some additions and tweaks in the form of a fork called sm64ex, one of them being a rendering backend that uses legacy OpenGL (1.1) instead of the original one, which I believe required 2.0. I decided it would be fun to try to compile it for IRIX on my O2.

Well, I ran into a couple of errors in my adventure so far, and I have been able to resolve all of them reasonably, except for this one I encountered at link time:
Code:
/usr/sgug/bin/ld: build/us_pc/src/pc/gfx/gfx_opengl_legacy.o: undefined reference to symbol 'glColor3f'
/usr/sgug/bin/ld: note: 'glColor3f' is defined in DSO //usr/lib32/libGLcore.so so try adding it to the linker command line
//usr/lib32/libGLcore.so: could not read symbols: Invalid operation
collect2: error: ld returned 1 exit status
make: *** [Makefile:990: build/us_pc/sm64.us.f3dex2e] Error 1
As far as I know, the default LDFLAGS for GL specifically in the project are just -lGL. sdl-config pulls in some libs of its own, but AFAIK none of them have anything to do with OpenGL. So, reading the error, I figured I'd try adding -lGLcore. Well, this is what happens in that case:
Code:
/usr/lib32/../lib32/libGL.so: undefined reference to `XSGIMiscQueryExtension'
/usr/lib32/../lib32/libGL.so: undefined reference to `_XSGISetSpecialDestroyHandler'
/usr/lib32/../lib32/libGL.so: undefined reference to `_XSGISetSpecialDestroyInterest'
/usr/lib32/libGLcore.so: undefined reference to `gl_INTERPRET_END'
collect2: error: ld returned 1 exit status
make: *** [Makefile:990: build/us_pc/sm64.us.f3dex2e] Error 1
No re-ordering of the linker flags, or specifying the DSOs manually by their full paths, has gotten rid of both of these errors. It seems like whenever GLcore is present, the latter error happens, and when it isn't, the first one always appears. Googling tells me very little about the symbols in the second error. I'm thinking that's because they are internal symbols that GL and GLcore use to talk to each other, and not something that's exposed; either that or, as the function names suggest, they're SGI-specific extensions that very little software has used, so there's nothing on the internet about them. I'm thinking that GLcore is not meant to ever be used directly in linker flags, but I'm still not really sure what's causing the first error. Maybe my GL setup is broken on my machine? It says it can't read the symbols from GLcore, but if I specify it explicitly, it clearly understands the file, and I know it isn't corrupt. That's why I wanted to ask here, since I was wondering if anybody had seen the same type of issues when trying to compile GL programs.

FYI, this uses SGUG-RSE 0.0.6, so GCC and GNU ld.

Addendum 1: Firstly, the second error looks like it's a result of the fact that libGLcore is not supposed to be invoked on the command line like this; it's linked directly to libGL and should be picked up automatically. After playing around with nm and ldd a bit, and being unable to find gl_INTERPRET_END anywhere in the GL libraries on my system, I'm thinking that GNU ld might be the problem here.

I would try IRIX ld, but I can't, at least not with what I currently have, because the rest of the program was compiled with GCC, and GCC performs LTO, meaning the object files are pretty meaningless to a non-GNU linker (AFAIK). I would re-compile with MIPSpro, but I doubt the large codebase will compile without a huge struggle. The SM64 PC port also relies on tools that are only in SGUG, like python3, so my path might get a little wonky. So I'm trying to stick with just SGUG-RSE, but it's proving hard.

The particular reason I'm leaning towards this being a GNU ld problem is that the GNU nm that comes with the same binutils refuses to read any of my system libraries; it just says that no symbols are found. Maybe one of the RSE devs knows something about this?

I think it would be helpful to try and compile a simple GL demo, but I'm not a programmer or anything, so I haven't been able to find anything that doesn't rely on an external dependency not included in IRIX by default, like GLUT. One should be able to grab the OpenGL context using Xlib+GLX, Motif, or even SDL, since I have that installed, and that would narrow down my issue significantly.

Also, I'm thinking of trying the Nekoware GCC; it should be a little less incompatible than MIPSpro, and while it's a little older than RSE, it should be more stable.
 

Elf

Storybook
Feb 4, 2019
355
80
28
So I did some digging... First, you will be happy to know that you are not the first to run into this issue! @gijoe77, @onre, and @hammy have all run into the exact same thing before, though unfortunately I did not find any quick resolution. I have to think that at least some OpenGL apps work, though, since we have working 3D in SDL, which at least gives promise. I also know @Northsky has written his own OpenGL demos, I think on IRIX? He may be able to advise on how he compiled them.

The most helpful thing I found was from @hammy who suggested:
Sorry not yet, no. (the gl_INTERPRET_END issue, I guess). This is possibly another one of those magical symbols that irix ld knows are filled in at runtime.
Also worth noting that libgl.so (as distinct from libGL.so) appears to reference the undefined gl_INTERPRET_END.

You might try a simple program with IRIX ld and see if it just works?

You can certainly try old Nekoware GCC and ld, but I would expect more of a headache from trying to mix those with newer GCC 9 compiled libraries than there would be any benefit. Since other people have seen this error going back a ways before RSE (2019/06), I would guess it is not caused by anything new in the toolchain.

Edit 1: Edited to remove some erroneous information about where gl_INTERPRET_END is.

Edit 2: Also if it helps, for reference, here is a list of libraries used by one of the IRIX GL demos. You might compare it against your libraries. Worth noting that libGLcore is included.

atlantis.png


Edit 3: I've also attached the OpenGL porting and programming guides, which list link options and example code. Nothing revolutionary, unfortunately, but at least you can get it straight from the authorities.

Link Lines for Individual Libraries
This section lists link lines and the libraries that will be linked.
  • -lGL OpenGL and GLX routines.
  • -lX11 Xlib, X client library for X11 protocol generation.
  • -lXext The X Extension library provides infrastructure for X client-side libraries (like OpenGL).
  • -lGLU OpenGL utility library.
  • -lXmu Miscellaneous utilities library (includes colormap utilities).
  • -lXt X toolkit library, infrastructure for widgets.
  • -lXm Motif widget set library.
  • -lGLw OpenGL widgets, Motif and core OpenGL drawing-area widgets.
  • -lXi X input extension library for using extra input devices.
  • -limage RGB file image reading and writing routines. The image library is only supported under IRIX. Open source alternatives like libjpeg and libpnm provide image I/O functions and are better alternatives when writing code that must also run on Linux and other platforms.
  • -lm Math library. Needed if your OpenGL program uses trigonometric or other special math routines.

Link Lines for Groups of Libraries
To use minimal OpenGL or additional libraries, use the following link lines:
  • Minimal OpenGL: -lGL -lXext -lX11
  • With GLU: -lGLU
  • With Xmu: -lXmu
  • With Motif and OpenGL widget: -lGLw -lXm -lXt
 

  • Like
Reactions: gijoe77 and foetz

gijoe77

New member
Feb 18, 2019
12
4
3
So, looking over my notes regarding the "gl_INTERPRET_END" error, the way I get around it (for gcc and GNU ld) is to use the following linker flag:

"--allow-shlib-undefined"

I think the default behavior with IRIX's ld is equivalent to GNU ld's --allow-shlib-undefined

regarding the other errors, I have not run across them yet
 
  • Like
Reactions: Elf

panther5482

New member
Apr 5, 2019
11
0
1
So, looking over my notes regarding the "gl_INTERPRET_END" error, the way I get around it (for gcc and GNU ld) is to use the following linker flag:

"--allow-shlib-undefined"

I think the default behavior with IRIX's ld is equivalent to GNU ld's --allow-shlib-undefined

regarding the other errors, I have not run across them yet
Thank you!! This did the trick. Unfortunately, however, as is usually the case with these projects, the linked binary now segfaults. It seems to get through a fair amount of the initialization process before finally segfaulting when trying to init the display. Here's the backtrace from GDB, if you have any clues. I'll probably have to do some digging through the codebase. I have some evidence that SDL works on my system, because I compiled CaveStory and that works fine, although it doesn't use GL.
error.png
 

hammy

Active member
Jun 1, 2019
107
61
28
UK
Hi panther,

Pretty much what @Elf and @gijoe77 said - the issue we've seen is that there are certain magical symbols that IRIX ld doesn't mind being unresolved.

For GNU ld (what's in RSE), you'll have to manually tell the linker/compiler that you don't mind the individual symbol being unresolved, or that you don't care about unresolved symbols at all.

So that's the linker issue - but there are deeper issues with what you are attempting, I'm afraid.

-> The main problem I see is that we've not yet got a good story/approach for sgug-rse and the system accelerated GL libs.

As of 0.0.6, we've got a "new", up-to-date libX11 client layer, and that new client layer is probably what you'll be linking against (we use RPATH in the RSE libraries to ensure we're not interfering with system applications).

This is a problem, currently, because some IRIX libraries expect an IRIX X11 library that has symbols we don't have in the "new" X11 client libs. GL and the standard Motif libraries have these issues.

"Can't I just use the IRIX libraries (but still with RSE)" - you might have some luck by explicitly linking against the IRIX X11 libs + GL, but I've not done that, others haven't yet done that, so be aware you're trailblazing at that point. (And personally I'd prefer we have a solution for running our new X11 libs with the SGI libGL).

Some of your points:

Q) "It can't read symbols from GLcore"
A) GNU nm will by default try to read the "normal" symbols from a shared object. IRIX is a special child and only stores things in the dynamic symbol table. Use `nm -D /usr/lib32/libGLcore.so`

Q) "because the rest of the program was compiled with GCC and GCC performs LTO"...
A) While you are trying to get things to compile and run, don't use LTO - I'd advise using `-g -Og` so you have at least some symbols and no LTO magic going on.

Q) Will nekoware GCC be better?
A) You may indeed have better luck with nekoware gcc, as then you won't be pulling in the more recent RSE X11 libs, but the system libGL stuff instead.

Q) Will nekoware GCC be more stable?
A) Hehe I'm a little biased, but nekoware GCC and binutils are missing a _lot_ of fixes we've pushed into the RSE - but yeah, by all means give that a poke.

Kr,

Hammy
 
  • Like
Reactions: Elf

panther5482

New member
Apr 5, 2019
11
0
1
Thanks hammy. I read what you said about the new libX11 client layer in RSE 0.0.6, and I'm thinking it would be easier to just link against the system X11 libs and bypass the defaults like you said. Do you know what I should or can do in that case - what I should pass to ld, maybe? And how would I know which libs I'm using?
 

panther5482

New member
Apr 5, 2019
11
0
1
"Can't I just use the IRIX libraries (but still with RSE)" - you might have some luck by explicitly linking against the IRIX X11 libs + GL, but I've not done that, others haven't yet done that, so be aware you're trailblazing at that point. (And personally I'd prefer we have a solution for running our new X11 libs with the SGI libGL).
Update - I uninstalled the X11 RPMs, re-linked the binary and now I have an error from SDL about an invalid GLX visual. Here it is:

Code:
FATAL ERROR:
could not set video mode [640x480 fullscreen 0]: Couldn't find matching GLX visual
Google says there's an env var to manually specify this value based on what you see from glxinfo. But I've tried setting it, and it's made no difference - also tried changing the default resolution in the config, and enabled/disabled fullscreen. Still no change. So I'm thinking it's some problem with GLX and my own setup, could be one of the issues you said might happen using system X11 libs with RSE 0.0.6. Let me know if you have ideas.
 

hammy

Active member
Jun 1, 2019
107
61
28
UK
I read what you said about the new libX11 client layer in RSE 0.0.6 and I'm thinking it would be easier to just link against the system X11 libs and bypass the defaults like you said. Do you know what I should or can do in that case? what I should pass to ld maybe? And how would I know which libs I'm using?
Yep, I think that (uninstalling the RPMs) is probably the sensible approach. Leaving the RSE X11 stuff there is going to be a world of pain. RSE has "features" that mean GCC will hunt for libraries and headers under /usr/sgug first before checking the system directories. That's a pain for what you are attempting.

You might also want to avoid using/setting LD_LIBRARYN32_PATH so that configure scripts aren't "discovering" libraries you don't want.

Care: Make sure to catch all the X11 libs - e.g. Xext, Xi etc - `ldd /usr/sbin/ivview` gives a good starter list to check.

Note: As mentioned - to see what libraries a binary would use at runtime, `ldd` - e.g. ldd /usr/bin/man

Update - I uninstalled the X11 RPMs, re-linked the binary and now I have an error from SDL about an invalid GLX visual. Here it is:

Code:
FATAL ERROR:
could not set video mode [640x480 fullscreen 0]: Couldn't find matching GLX visual
Google says there's an env var to manually specify this value based on what you see from glxinfo. But I've tried setting it, and it's made no difference - also tried changing the default resolution in the config, and enabled/disabled fullscreen. Still no change. So I'm thinking it's some problem with GLX and my own setup, could be one of the issues you said might happen using system X11 libs with RSE 0.0.6. Let me know if you have ideas.
If you can, I'd start from the simplest GLX application you can, and get that building. Verify it's linking against the libraries you want it to, and then reproduce the error you see from the larger project you're attempting. I'd maybe even go as far as doing that simple application purely with MIPSpro - so you can see which libraries are needed.

That might yield enough info to discover where your problem is coming from.

Sorry I can't help more, I'm focused on dnf/microdnf so we have a natural update mechanism - please do let us know how you get on and what you discover, even if it's "X and Y didn't work" - that's very handy info too!

Best of luck.

H
 
  • Like
Reactions: panther5482 and Elf

joostp

New member
Oct 5, 2020
4
1
3
sm64ex requires OpenGL 1.3 (or 1.2 with the GL_ARB_multitexture and GL_ARB_texture_env_combine extensions).

According to glxinfo my Fuel with a V10 on IRIX 6.5.30 has OpenGL 1.2 and neither extension.

When I run sm64ex here (compiled on a pc using a gcc cross-compiler) it errors out with the following:

Code:
GL extension not supported: GL_ARB_multitexture
X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  143 (DOUBLE-BUFFER)
  Minor opcode of failed request:  1 (DBEAllocateBackBufferName)
  Serial number of failed request:  25
  Current serial number in output stream:  26
FATAL ERROR:
required GL extensions are not supported
So unless someone changes the code to not require these extensions it's probably not going to run.
 

panther5482

New member
Apr 5, 2019
11
0
1
sm64ex requires OpenGL 1.3 (or 1.2 with the GL_ARB_multitexture and GL_ARB_texture_env_combine extensions).

According to glxinfo my Fuel with a V10 on IRIX 6.5.30 has OpenGL 1.2 and neither extension.

When I run sm64ex here (compiled on a pc using a gcc cross-compiler) it errors out with the following:

Code:
GL extension not supported: GL_ARB_multitexture
X Error of failed request:  BadMatch (invalid parameter attributes)
  Major opcode of failed request:  143 (DOUBLE-BUFFER)
  Minor opcode of failed request:  1 (DBEAllocateBackBufferName)
  Serial number of failed request:  25
  Current serial number in output stream:  26
FATAL ERROR:
required GL extensions are not supported
So unless someone changes the code to not require these extensions it's probably not going to run.
Are you using the nightly branch with the GL_legacy option in your make command? Remember that this has been confirmed to run on NT 3.51, NT4, Win95, and Win98 using OpenGL, and as far as I know those don't support higher than 1.2 anyway.

That isn't even the error I'm getting anyway, but I'm curious whether you manage to get past the point I'm at, and what you've done to fix it. The line of code causing my error is here: https://github.com/sm64pc/sm64ex/blob/c1ed30a2faf07d289ffdbbe3e4208eba236fa13a/src/pc/gfx/gfx_sdl1.c#L135 Maybe if you add an else statement there, with a printf saying "No Visual mode error", you'd know that the code gets past that point (assuming you have the SDL1 backend enabled).

Also, I'd be interested in knowing your compiler setup so we can continue to exchange build notes.
 

joostp

New member
Oct 5, 2020
4
1
3
Ah no, I wasn't aware of the nightly branch and was building master (with GL_LEGACY set and using SDL2), so this is where it got to: https://github.com/sm64pc/sm64ex/blob/master/src/pc/gfx/gfx_opengl_legacy.c#L531

I'll try building the nightly branch tonight and see if I get the same result as you.

As for my setup, I'm using an old gcc 4.4.1 cross-compiler on linux that I built many years ago. I should probably upgrade it at some point but it's served me well building everything that I use on IRIX.
 

joostp

New member
Oct 5, 2020
4
1
3
The nightly branch of sm64ex builds and works here (both SDL1 and SDL2 versions) albeit with some visual camera/clipping issues during gameplay, perhaps these are due to this old compiler or some build flag I need to adjust... I'll have a look.

Here's a screenshot of it running on my Fuel:

sm64ex_01.png


My toolchain is pretty standard, just gcc 4.4.1 targeting mips-sgi-irix6.5 and built with headers and libs that I copied across from my Octane at the time. I'm statically linking as much as possible, including SDL.
This is the output from ldd on the sm64ex SDL1 build:

Code:
$ ldd sm64.us.f3dex2e
        libGL.so  =>     /usr/lib32/libGL.so   
        libpthread.so  =>        /usr/lib32/libpthread.so       
        libaudio.so  =>  /usr/lib32/libaudio.so
        libmd.so  =>     /usr/lib32/libmd.so   
        libfastm.so  =>  /usr/lib32/libfastm.so
        libm.so  =>      /usr/lib32/libm.so     
        libc.so.1  =>    /usr/lib32/libc.so.1   
        libGLcore.so  =>         /usr/lib32/libGLcore.so       
        libXsgivc.so  =>         /usr/lib32/libXsgivc.so       
        libXext.so  =>   /usr/lib32/libXext.so 
        libX11.so.1  =>  /usr/lib32/libX11.so.1

The only changes I made to the sm64ex source were to comment out the STATIC_ASSERT macros and some #pragmas that this old gcc doesn't like, plus some minor edits like removing the anonymous union from pc/configfile.c. In the build options I have EXT_OPTIONS_MENU set to 0 (for similar reasons).

Anyway, since the error you're now encountering is at runtime, the info above perhaps isn't that useful.
I would check the output of glxinfo to see whether your O2 really supports the video mode that sm64ex is trying to init, and try changing the SDL_GL_SetAttribute() calls at lines 109-114 based on your glxinfo output to see if that makes any difference.
 
  • Like
Reactions: panther5482

panther5482

New member
Apr 5, 2019
11
0
1
The nightly branch of sm64ex builds and works here (both SDL1 and SDL2 versions) albeit with some visual camera/clipping issues during gameplay, perhaps these are due to this old compiler or some build flag I need to adjust... I'll have a look.

Here's a screenshot of it running on my Fuel:

sm64ex_01.png

My toolchain is pretty standard, just gcc 4.4.1 targeting mips-sgi-irix6.5 and built with headers and libs that I copied across from my Octane at the time. I'm statically linking as much as possible, including SDL.
This is the output from ldd on the sm64ex SDL1 build:

Code:
$ ldd sm64.us.f3dex2e
        libGL.so  =>     /usr/lib32/libGL.so  
        libpthread.so  =>        /usr/lib32/libpthread.so      
        libaudio.so  =>  /usr/lib32/libaudio.so
        libmd.so  =>     /usr/lib32/libmd.so  
        libfastm.so  =>  /usr/lib32/libfastm.so
        libm.so  =>      /usr/lib32/libm.so    
        libc.so.1  =>    /usr/lib32/libc.so.1  
        libGLcore.so  =>         /usr/lib32/libGLcore.so      
        libXsgivc.so  =>         /usr/lib32/libXsgivc.so      
        libXext.so  =>   /usr/lib32/libXext.so
        libX11.so.1  =>  /usr/lib32/libX11.so.1

The only changes I made to the sm64ex source were to comment out the STATIC_ASSERT macros and some #pragmas that this old gcc doesn't like, plus some minor edits like removing the anonymous union from pc/configfile.c. In the build options I have EXT_OPTIONS_MENU set to 0 (for similar reasons).

Anyway, since the error you're now encountering is at runtime, the info above perhaps isn't that useful.
I would check the output of glxinfo to see whether your O2 really supports the video mode that sm64ex is trying to init, and try changing the SDL_GL_SetAttribute() calls at lines 109-114 based on your glxinfo output to see if that makes any difference.
Wow, nice!
Well, it seems like you have quite a different setup from mine, since I'm using a lot of prebuilt software, like the compiler from SGUG-RSE and SDL from nekoware. I've been thinking that mixing and matching has been my issue, but I really don't know.
And maybe it is because of those SetAttribute calls, I missed that initially. Thanks!
 

panther5482

New member
Apr 5, 2019
11
0
1
That seems to have been the issue. Here it is running on my O2:
mario64.png

I don't have any camera glitches like you're talking about, but it is a little slow, not to mention that the window seems to be running at the wrong size. That's probably a minor fix, though. Also, there are a couple of bugs in the transitions between levels; this could be because I had to disable the alpha channel in the GLX visual.
 

joostp

New member
Oct 5, 2020
4
1
3
Cool, glad you got it working too.

The clipping issues I'm seeing look depth buffer related, as I don't see them on another machine with a different graphics board. I recall having similar issues in other 3d games in the past with this V10, which is a shame, as the sm64ex performance is great compared to the slideshow I get on my Octane with ESI graphics. What kind of framerate are you getting on the O2?

I do also see the transition issue that you mention. It looks like half of the mask texture is (not) flipped.

sm64ex_transition_issue.png


Is that what you're seeing as well? Shadows seem to have the same issue with texture flipping not working properly.
 

Elf

Storybook
Feb 4, 2019
355
80
28
FWIW, the slideshow on the ESI graphics is likely related to texturing. When the cards don't have any TRAM installed, anything with texturing (incl. the pack-in SGI demos) runs terribly slowly! With TRAM, I would think the performance on something like this should be decent.
 

panther5482

New member
Apr 5, 2019
11
0
1
Cool, glad you got it working too.

The clipping issues I'm seeing look depth buffer related, as I don't see them on another machine with a different graphics board. I recall having similar issues in other 3d games in the past with this V10, which is a shame, as the sm64ex performance is great compared to the slideshow I get on my Octane with ESI graphics. What kind of framerate are you getting on the O2?

I do also see the transition issue that you mention. It looks like half of the mask texture is (not) flipped.

sm64ex_transition_issue.png

Is that what you're seeing as well? Shadows seem to have the same issue with texture flipping not working properly.
Yes, that's what I see as well. And to put it shortly, I'd say it runs like ass: 15-20 fps. I imagine there is *some* optimization that could be done here to make it better suited to the specifics of SGI hardware (like machines without TRAM, as Elf mentioned)? I don't have that kind of skill though, hehe
 

panther5482

New member
Apr 5, 2019
11
0
1
On second thought, my O2 doesn't really need texture RAM to be fast (theoretically, or at least it should be faster than SGI graphics without TRAM), since it uses unified memory, so I'm wondering if the culprit is something else - I feel like the game shouldn't be that hard to run.
 

drmadison

New member
Jun 30, 2020
16
9
3
That star wipe looks like a mis-set texparam. It looks like the code is trying to use GL_MIRRORED_REPEAT for the texture wrap parameter, which didn't exist in OpenGL 1.2/1.3. Not sure there's much to be done about that.

Looking forward to digging into this a bit this weekend, should be fun!
 
  • Like
Reactions: Elf
