Why does a C# host application change the long double precision of a C++ DLL?

EDN Admin

Well-known member
Joined
Aug 7, 2010
Messages
12,794
Location
In the Machine
Hello all,

I developed a DLL in Intel C++ that performs long double arithmetic operations.
My DLL allocates 128 bits for each long double variable, and the operations carry 19 digits of precision when the host application is written in C++ or Delphi.

With a C# host application the DLL still allocates 128 bits, but the number of significant digits drops to 15 (the same as double).

How is this possible, and what do I have to do to get the number of significant digits back to 19?
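
One plausible cause, assuming a 32-bit process where long double maps to the x87 80-bit format: when the CLR initializes, it sets the x87 FPU control word to 53-bit (double) precision, which caps every x87 operation in the process, including those inside a native DLL, at about 15-16 significant digits. Below is a minimal sketch of restoring 64-bit precision from inside the DLL; RestoreExtendedPrecision and AddLD are hypothetical names, and note that _PC_64 is not supported in x64 builds.

#include <float.h>   // _controlfp_s, _PC_64, _MCW_PC

// Undo the 53-bit precision the CLR sets at startup; call this at the
// start of each exported function (the x87 control word is per-thread).
static void RestoreExtendedPrecision()
{
    unsigned int current = 0;
    _controlfp_s(&current, _PC_64, _MCW_PC);  // x87 mantissa back to 64 bits
}

extern "C" __declspec(dllexport) long double AddLD(long double a, long double b)
{
    RestoreExtendedPrecision();
    return a + b;   // evaluated with a 64-bit mantissa (~19 digits)
}

A C++ or Delphi host typically leaves the control word at 64-bit precision, which would match the difference in behavior described above.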
Thanks.


