I have a vector<BYTE> that represents characters in a string. I want to interpret those characters as ASCII characters and store them in a Unicode (UTF-16) string. The current code assumes that the characters in the vector<BYTE> are Unicode rather than ASCII. This works fine for standard ASCII, but fails for extended ASCII characters. Those characters need to be interpreted using the current code page, retrieved via GetACP(). How would I go about creating a Unicode (UTF-16) string from these ASCII characters?
EDIT: I believe the solution should have something to do with the macros discussed here: http://msdn.microsoft.com/en-us/library/87zae4a3(v=vs.80).aspx I'm just not exactly sure how the actual implementation would go.
int ExtractByteArray(CATLString* pszResult, const CByteVector* pabData)
{
    // Place the data into the output string by widening each byte with a
    // plain cast -- this only works for 7-bit ASCII; extended characters
    // are not mapped through the ANSI code page.
    pszResult->Empty();
    for (int iIndex = 0; iIndex < pabData->GetSize(); iIndex++)
        *pszResult += (TCHAR)pabData->GetAt(iIndex);

    return RC_SUCCESS;
}
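My best guess so far is something along the lines of the sketch below: copy the bytes into a contiguous buffer (CByteVector only hands out individual values) and widen them with MultiByteToWideChar using CP_ACP, which is the code page GetACP() reports. The function name ExtractByteArrayAcp, the CStringW output type, and the RC_FAILURE code are placeholders, not part of the existing code.

// Sketch only: convert the bytes through the current ANSI code page
// (CP_ACP, i.e. what GetACP() reports) into a UTF-16 CStringW.
// ExtractByteArrayAcp and RC_FAILURE are hypothetical names.
#include <windows.h>
#include <string>

int ExtractByteArrayAcp(CStringW* pszResult, const CByteVector* pabData)
{
    // Copy the bytes into a contiguous buffer, since CByteVector
    // only exposes individual values via GetAt().
    std::string narrow;
    narrow.reserve(pabData->GetSize());
    for (int iIndex = 0; iIndex < pabData->GetSize(); iIndex++)
        narrow += (char)pabData->GetAt(iIndex);

    pszResult->Empty();
    if (narrow.empty())
        return RC_SUCCESS;

    // Ask how many UTF-16 code units the ANSI text needs, then convert.
    int cchWide = MultiByteToWideChar(CP_ACP, 0, narrow.data(),
                                      (int)narrow.size(), NULL, 0);
    if (cchWide <= 0)
        return RC_FAILURE; // hypothetical error code

    std::wstring wide(cchWide, L'\0');
    MultiByteToWideChar(CP_ACP, 0, narrow.data(), (int)narrow.size(),
                        &wide[0], cchWide);

    *pszResult = CStringW(wide.c_str(), cchWide);
    return RC_SUCCESS;
}

As far as I can tell, the ATL conversion classes from the linked MSDN page (e.g. CA2W, which accepts an optional code-page argument) wrap this same API, but they expect a null-terminated string, which is why the bytes are copied out first here.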
Why not use a CString constructor to do it automatically? – Kimberli

"vector<BYTE> that represents characters in a string." - why not std::string? – Bathesda

CString constructors: msdn.microsoft.com/en-US/library/cws1zdt8(v=vs.110).aspx Just use the one that receives a pointer to char and a length, and your job is done. – Kimberli

CByteVector is implemented in such a way that I cannot get a pointer to the first object, just the value. – Ululant

CString is broken. But I'm confused. In the question you said you had vector<BYTE>. I've no enthusiasm to help if we can't even work out what the question is. – Kimberli
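If the constructor route from the comments pans out, it could look roughly like the sketch below. The helper name BytesToUnicode is hypothetical, and the copy into a std::vector<char> is only there because CByteVector apparently cannot give out a pointer to its storage; the CStringW (LPCSTR, length) constructor then converts the narrow text to UTF-16 through the ANSI code page.

// Sketch of the constructor-based route suggested in the comments.
#include <vector>

CStringW BytesToUnicode(const CByteVector* pabData)
{
    // Gather the bytes into contiguous storage first.
    std::vector<char> narrow;
    narrow.reserve(pabData->GetSize());
    for (int iIndex = 0; iIndex < pabData->GetSize(); iIndex++)
        narrow.push_back((char)pabData->GetAt(iIndex));

    if (narrow.empty())
        return CStringW();

    // The (LPCSTR, int) constructor widens the narrow characters to
    // UTF-16 using the ANSI code page, the same mapping GetACP() names.
    return CStringW(narrow.data(), (int)narrow.size());
}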