Fat-Zer wrote:
> 2016-03-25 3:58 GMT+03:00 deloptes <deloptes@...>:
>> Hi all,
>>
>> ...
>>
>> However, this works in my test program:
>>
>> std::string teststr(newItem.ascii());
>> std::cout << teststr << "\n";
>>
>> and this contradicts the logic of ascii(): all the ��� are there.
>>
> Nope, it doesn't... see the ascii() documentation: "If a codec has
> been set using QTextCodec::codecForCStrings(), it is used to convert
> Unicode to 8-bit char. Otherwise, this function does the same as
> latin1()."
> However, you generally shouldn't use ascii() unless either you are
> positive the string contains only ASCII characters, or some other
> interface strictly accepts those and you don't care about the rest...

Hi,

this is also how I understand ascii(), but do you have an explanation for how I then see the ��� (UTF-8?)? The above was just an experiment.

For the code I wrote, I solved the problem by passing the c_str() to parseVCard. That passes a plain char array and does not care much about the content (as I understand it).

regards