Heap Corruption but only when compiled on laptop
I have a program that compiles and runs perfectly fine on my desktop. On my laptop it also compiles, but it gives me this error whenever it is run:

Windows has triggered a breakpoint in RR.exe.

This may be due to a corruption of the heap, which indicates a bug in RR.exe or any of the DLLs it has loaded.

This may also be due to the user pressing F12 while RR.exe has focus.

The output window may have more diagnostic information.

I've commented out lines until I found the one that triggers the error, which is:

if(glfwOpenWindow(width_, height_, 0, 0, 0, 0, 32, 0, GLFW_WINDOW) != GL_TRUE) {
    throw std::runtime_error("Unable to open GLFW window");
}

The weird thing is that if I replace width_ and height_ with constants, e.g. 800 and 600 respectively, the heap corruption stops. It also doesn't crash if I just use the default values set by the constructor instead of passing values.

Here's the complete code. The above lines are in the Window constructor.

window.h

#pragma once

#include <iostream>
#include <GL\glew.h>
#include <GL\glfw.h>

#pragma comment(lib, "opengl32.lib")
#pragma comment(lib, "glu32.lib")
#pragma comment(lib, "glew32.lib")
#pragma comment(lib, "GLFW.lib")

class Window {
public:
    Window(unsigned width = 800, unsigned height = 600);
    ~Window();

    void clear();
    inline void display() { glfwSwapBuffers(); }
    inline bool exit() { return !glfwGetWindowParam(GLFW_OPENED); }

private:
    unsigned width_, height_;
};

window.cpp

#include "window.h"

#include <sstream>   // std::stringstream, used to build the glewInit error message
#include <stdexcept> // std::runtime_error

Window::Window(unsigned width, unsigned height) : width_(width), height_(height) {
    if(glfwInit() != GL_TRUE) {
        throw std::runtime_error("Unable to initialize GLFW");
    }

    if(glfwOpenWindow(width_, height_, 0, 0, 0, 0, 32, 0, GLFW_WINDOW) != GL_TRUE) { //crash
    //if(glfwOpenWindow(800, 600, 0, 0, 0, 0, 32, 0, GLFW_WINDOW) != GL_TRUE) { //no crash
        throw std::runtime_error("Unable to open GLFW window");
    }

    GLenum result = glewInit();
    if(result != GLEW_OK) {
        std::stringstream ss;
        ss << "Unable to initialize glew: " << glewGetErrorString(result);
        throw std::runtime_error(ss.str());
    }
}

Window::~Window() {
    glfwTerminate();
}

void Window::clear() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
}

main.cpp

#include "window.h"

int main() {
    Window wind(1024, 800); //crash
    //Window wind(800, 600); //crash
    //Window wind(); //works (only one of these at a time; see comments)

    return 0;
}
Glosseme answered 19/3, 2012 at 11:42 Comment(12)
+1 for supplying a complete, relatively short test case. – Trichinopoly
Have you ensured you have the exact same versions of system/runtime DLLs on both machines? – Iphagenia
They are the exact same DLLs, .libs and project, as I keep them on Dropbox. – Glosseme
What about the output window? I think this is not heap related... – Kraemer
What requirements does the library place on the hardware, and does the laptop's hardware comply with them? (From a quick look it seems to require OpenGL 3.x; does your laptop's hardware support that?) – Ambient
Window wind(); is a function declaration, not a default-constructed variable of type Window. – Eelworm
@Eelworm Does that not use the default values set in the constructor? – Glosseme
Are you sure the line with glfwOpenWindow causes the crash and doesn't just expose it? Do width_ and height_ contain valid and correct values at the time of the call? – Phebe
@Rarge: to default-construct a variable of type Window you just write Window wind; – Phebe
@Rarge: the constructor isn't called at all in that case. You simply declare a function with signature Window wind(void) and do nothing with it. This explains the weird behaviour: none of the cases actually works on your laptop; the case Window wind() simply does nothing and therefore doesn't cause the crash. – Eelworm
Perhaps the function is just failing and you throw an exception. Since nothing catches that exception, it manifests as a normal crash. – Dorotheadorothee
I've managed to stop the heap corruption by changing the Runtime Library from Multi-threaded Debug DLL to Multi-threaded DLL. I will post it as an answer when I can. – Glosseme

The problem seems to lie with glfw:

I assume you are trying to use dynamically linked GLFW. Note this section in the glfw header:

#if defined(_WIN32) && defined(GLFW_BUILD_DLL)

/* We are building a Win32 DLL */
 #define GLFWAPI      __declspec(dllexport)
 #define GLFWAPIENTRY __stdcall
 #define GLFWCALL     __stdcall
#elif defined(_WIN32) && defined(GLFW_DLL)

 /* We are calling a Win32 DLL */
 #if defined(__LCC__)
  #define GLFWAPI      extern
 #else
  #define GLFWAPI      __declspec(dllimport)
 #endif
 #define GLFWAPIENTRY __stdcall
 #define GLFWCALL     __stdcall

#else

 /* We are either building/calling a static lib or we are non-win32 */
 #define GLFWAPIENTRY
 #define GLFWAPI
 #define GLFWCALL

#endif

GLFW_BUILD_DLL was apparently set while building the DLL, so the API functions were declared with the __stdcall calling convention.

But when using the library you haven't defined GLFW_DLL, so your code assumed the __cdecl calling convention. The difference between __cdecl and __stdcall is, roughly, that with __cdecl the caller cleans up the stack, while with __stdcall the callee does. So the stack was cleaned twice, which is why you got the corruption.

After I defined GLFW_DLL before including glfw in your program, it started working correctly. Also note that I used MinGW and had to link against glfwdll.a instead of glfw.a after defining GLFW_DLL.
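Concretely, the calling side just needs the macro defined before the header is pulled in (or passed on the command line, e.g. /DGLFW_DLL with MSVC or -DGLFW_DLL with MinGW). A minimal sketch:

```cpp
// Must come before the GLFW header so that GLFWAPI expands to
// __declspec(dllimport) and the functions are declared __stdcall.
#define GLFW_DLL
#include <GL\glfw.h>
```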

Formally answered 20/3, 2012 at 8:29 Comment(2)
This also worked for me. I also managed to get it working by changing the Runtime Library from Multi-threaded Debug DLL to Multi-threaded DLL. Would you be able to explain why passing constant values instead of variables didn't cause the heap corruption? Is the function called differently when all the arguments are constants? – Glosseme
@Rarge, changing the Runtime Library from Multi-threaded Debug DLL to Multi-threaded DLL seems to be connected with the Windows CRT, not GLFW. Neither passing constant values nor changing project settings prevents the corruption; they only hide it. – Formally

Heap corruption bugs almost never manifest at the point where they originally occur, which is what makes them so painful to diagnose. The fact that the program works on one system and not another implies undefined behavior.

I didn't see any obvious bugs in a quick inspection of your code. If you have access to Purify for Windows, or alternatively the ability to compile on Linux, you could use Valgrind. I believe either of those tools has a much higher chance of success than simple code inspection.

Preoccupation answered 19/3, 2012 at 13:25 Comment(0)

Another solution I came across:

Changing the Runtime Library (Project Properties > C/C++ > Code Generation) from Multi-threaded Debug DLL (/MDd) to Multi-threaded DLL (/MD) made the heap corruption no longer occur.

I don't know why, though; perhaps someone with more knowledge could shed some light on this.

Glosseme answered 20/3, 2012 at 13:10 Comment(0)
