Delphi Data type too large: exceeds 2 GB in Berlin Update 2
I have a unit that is shared between Delphi and Lazarus. In Lazarus the unit compiles without any error, but in Delphi it fails with Error: Data type too large: exceeds 2 GB. Below is the code:

unit UType;

{$ifdef FPC}
 {$MODE delphi}{$H+}
{$endif}

interface


type
  TFreqType = Extended;

  TFreqCutArray = Array [0..0]of TFreqType;

  PFreqCutArray = ^TFreqCutArray;

  FilterOrder = Integer;

  TAS_Sample = Extended;

  TAS_SampleArray = Array[0..High(Integer) div Sizeof(TAS_Sample) - 1] of TAS_Sample;

  PTAS_SampleArray = ^TAS_SampleArray;

  TAS_Float = Extended;

  TComplex = record
    Re, Im: TAS_Sample; // Z = Re + i*Im
  end;

  PComplex = ^TComplex;
  TComplexArray = Array[0..High(Integer) div Sizeof(TComplex) - 1] of TComplex; // here Delphi gives the error

  PComplexArray = ^TComplexArray;
  FilterProc = function(V: TAS_Sample): TAS_Sample of object;

implementation

end.

I am using Delphi 10.1 Berlin Update 2; the same code compiles in Lazarus without any error.

Pouliot answered 4/4, 2017 at 7:19 Comment(0)

This seems like a compiler defect. You can declare the type like this:

TComplexArray = Array[0..67108862] of TComplex;

and the compiler will accept the declaration. Note that 67108862 = High(Integer) div SizeOf(TComplex) - 1: with two Extended fields each padded to 16 bytes under default alignment, SizeOf(TComplex) is 32, and 2147483647 div 32 - 1 = 67108862.

You can avoid hard coding the upper bound by declaring a constant:

const
  ComplexArrayUpperBound = High(Integer) div Sizeof(TComplex) - 1;

type
  TComplexArray = Array[0..ComplexArrayUpperBound] of TComplex;

This style of type declaration is very much out of fashion these days. I would strongly recommend that you use dynamic arrays instead. These give you automatic cleanup of dynamic memory, and allow the compiler to add range checking for all your array accesses. The latter point is important, since it gives you early warning of bounds errors in your code.
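For example, the dynamic-array approach might be sketched like this (TComplexDynArray and FillZero are illustrative names, not from the original unit):

```pascal
type
  TComplexDynArray = array of TComplex; // length chosen at runtime

procedure FillZero;
var
  Data: TComplexDynArray;
  i: Integer;
begin
  SetLength(Data, 1024);          // allocate 1024 elements on the heap
  for i := 0 to High(Data) do     // with range checking ($R+), bad indices raise ERangeError
  begin
    Data[i].Re := 0.0;
    Data[i].Im := 0.0;
  end;
end;                              // Data is freed automatically when it goes out of scope
```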

If you aren't allocating the arrays and instead declare these types to enable array indexing, then it is probably simpler to use {$POINTERMATH ON}.
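A minimal sketch of that alternative, assuming the memory is allocated elsewhere and you are only handed a PComplex to the first element (Process is an illustrative name):

```pascal
{$POINTERMATH ON} // enables array indexing and arithmetic on typed pointers

procedure Process(P: PComplex; Count: Integer);
var
  i: Integer;
begin
  for i := 0 to Count - 1 do
    P[i].Re := P[i].Re * 0.5; // P[i] indexes directly; no huge array type needed
end;
```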

Furthermore, I would suggest that you use Double instead of Extended. It is highly unlikely that you have any need for the 10-byte Extended type, and switching to Double will halve your memory requirements. Because of alignment, your TComplex is 32 bytes in size, but a Double based version would be 16 bytes. This saving will result in a significant performance benefit due to better use of the cache.
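A Double based version of the record might look like this (a sketch; TAS_SampleD and TComplexD are illustrative names):

```pascal
type
  TAS_SampleD = Double; // 8 bytes, vs. 10 (padded to 16) for Extended

  TComplexD = record
    Re, Im: TAS_SampleD; // Z = Re + i*Im; SizeOf(TComplexD) = 16
  end;
```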

Multiplicate answered 4/4, 2017 at 7:46 Comment(13)
Isn't this style of pointer type usually intended to allow array-style addressing into arbitrary memory, rather than having anything to do with dynamic-length arrays per se? But I think that too is supported in modern compilers without the need for such tricks. Maybe not FPC (but I'd be surprised)? :shrug:Boreal
@Boreal Typically you will allocate the memory, in which case dynamic arrays solve all the problems in one go. If the memory is allocated elsewhere and you are just handed a pointer, then you'd set {$POINTERMATH ON} to allow array indexing.Multiplicate
By using Double instead of Extended the problem was resolved, thanks.Pouliot
@David - Without the implementation details it's impossible to say whether dynamic arrays are the appropriate alternative in this case. The decls look pretty low-level, so working with pre-allocated buffers seems a distinct possibility. If you're going to expand the answer into points unrelated to the actual problem, then in this case you should at least also mention POINTERMATH. Brevity is acceptable but fuller explanations are better. ;)Boreal
@Deltics: I don't think anyone will pre-allocate a buffer of almost 2GB. These declarations are the old-fashioned way to be able to access arrays through a pointer without raising an out of range error. More modern Delphis would use dynarrays or, if only passed pointers, pointer math.Quevedo
@RudyVelthuis Nobody is expecting an allocation of the entire type. But before dyn arrays you'd call GetMem to allocate an array of the desired size and then access it using the pointer type.Multiplicate
@David: that is exactly what I meant. The pointer type of the array was relevant, not the (huge) array type itself. That has been superseded by dynarrays and pointer math.Quevedo
@rudy Deltics was not suggesting such a large allocation. Nobody was.Multiplicate
Then how should I read "The decls look pretty low-level, so working with pre-allocated buffers seems a distinct possibility of being involved at least". If the buffers are pre-allocated, one can use the real size and doesn't need to declare such a huge type.Quevedo
@Rudy No, you read it as some other party allocates the memory and hands you a pointer to the first element, and the number of elements.Multiplicate
@rudy your original comment talked about allocating the entire arrayMultiplicate
@DavidHeffernan: Hmmm... I didn't read it like that, but I see that could be meant.Quevedo
FWIW, if the sample is a Double instead of an Extended, it compiles. I guess the compiler has some issues with Extended. Weird, and most definitely a bug.Quevedo

Here is an old declaration that compiles just fine:

  PUserInfo0Arr = ^TUserInfo0Arr;
  TUserInfo0Arr = array[0..MaxInt div SizeOf(TUserinfo0) - 1] of TUserinfo0;

A few dozen of these are left in my code; these days I prefer incrementing a typed pointer with Inc instead.
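That Inc pattern might be sketched like this, reusing TComplex from the question (an illustrative sketch, not the answerer's actual code):

```pascal
procedure Walk(P: PComplex; Count: Integer);
var
  i: Integer;
begin
  for i := 1 to Count do
  begin
    P^.Re := 0.0; // work on the current element through the pointer
    Inc(P);       // advances P by SizeOf(TComplex) bytes to the next element
  end;
end;
```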

Realistic answered 5/4, 2017 at 4:30 Comment(1)
But what about the type in the question?Multiplicate

© 2022 - 2024 — McMap. All rights reserved.