Dumps obtained from Windows Error Reporting typically have a useless current context set on the faulting thread, with a stack deep in WerpReportFault. The actual context at the time of the exception can be retrieved with .ecxr, which also sets the context in such a way that subsequent commands on the same thread (such as k) return the "correct" information.
I am building a tool for automatic dump analysis, which uses IDebugControl::GetStackTrace to obtain the faulting thread's stack. I can retrieve the stored exception context using IDebugControl4::GetStoredEventInformation. If I pass the EBP/RBP, ESP/RSP, and EIP/RIP values from the stored context to GetStackTrace, I get the correct stack. However, I would much rather replicate what the .ecxr command does, setting the "correct" state until the thread is switched. I tried using IDebugAdvanced::SetThreadContext, but it seems to be an illegal operation for dump targets and fails with 0x8000FFFF (E_UNEXPECTED).
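For reference, this is roughly what my working approach looks like: a sketch, assuming a 64-bit target, an already-initialized IDebugControl4* attached to the dump, and the current thread set to the faulting thread. The 64-frame limit and output flags are arbitrary choices, not requirements.

```cpp
#include <windows.h>
#include <dbgeng.h>

HRESULT StackFromStoredException(IDebugControl4* control)
{
    ULONG type = 0, procId = 0, threadId = 0, ctxUsed = 0, extraUsed = 0;
    CONTEXT ctx = {};  // receives the target-machine CONTEXT; x64 assumed here

    // Pull the exception-time context that .ecxr would use.
    HRESULT hr = control->GetStoredEventInformation(
        &type, &procId, &threadId,
        &ctx, sizeof(ctx), &ctxUsed,
        nullptr, 0, &extraUsed);
    if (FAILED(hr)) return hr;

    // Walk the stack from the exception-time registers instead of the
    // current (WerpReportFault) context.
    DEBUG_STACK_FRAME frames[64];
    ULONG filled = 0;
    hr = control->GetStackTrace(ctx.Rbp, ctx.Rsp, ctx.Rip,
                                frames, ARRAYSIZE(frames), &filled);
    if (FAILED(hr)) return hr;

    return control->OutputStackTrace(
        DEBUG_OUTCTL_ALL_CLIENTS, frames, filled,
        DEBUG_STACK_FRAME_ADDRESSES | DEBUG_STACK_FRAME_FUNCTION_INFO);
}
```

This gives the correct stack, but only for this one call; it does not change the thread's effective context for other engine operations, which is the part I want to replicate.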
I tried to figure out what .ecxr does by debugging a WinDbg instance, and it looks like .ecxr is implemented in dbgeng!DotEcxr. However, from tracing it (with wt) I wasn't able to understand how it resets the current thread's context. In any case, it doesn't seem to call any of the COM debug-client interface methods, and it doesn't use IDebugAdvanced::SetThreadContext.
Any suggestions on how to set the thread context in a dump file would be much appreciated. As a last resort, I can always use IDebugControl::Execute and simply invoke the .ecxr command, but I'd prefer a more programmatic approach.