When (re)allocating an array with a very large number of elements using
the MallocArray or ReallocArray macros, the calculated size of the array
could overflow size_t and less memory would be allocated than requested.
Add new functions for (re)allocating arrays that check the size and use
them in the MallocArray and ReallocArray macros.
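A minimal sketch of such a checked allocator, assuming a hypothetical
name (malloc_array_checked); the actual function names and failure
handling in the codebase may differ:

  #include <stdint.h>
  #include <stdlib.h>

  /* Hypothetical overflow-checked allocator for arrays */
  static void *
  malloc_array_checked(size_t nmemb, size_t size)
  {
    void *p;

    /* Refuse element counts whose total size would not fit in size_t */
    if (nmemb > 0 && size > SIZE_MAX / nmemb)
      abort();

    p = malloc(nmemb * size);
    if (!p && nmemb * size > 0)
      abort();

    return p;
  }

  /* The macros can then forward to the checked function, e.g.
     #define MallocArray(T, N) ((T *)malloc_array_checked((N), sizeof (T))) */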
This couldn't be exploited, because all arrays that can grow with cmdmon
or NTP requests either have their size checked before allocation, or
they are much smaller than the memory allocated for the structures to
which they are related (e.g. ntp_core and sourcestats instances), so a
memory allocation would fail before their size could overflow.
This issue was found in an audit performed by Cure53 and sponsored by
Mozilla.
Some libc calls like memcpy() expect the pointer to be valid even when
the size is zero and there is nothing to do. Instead of checking the
size before all such calls, modify ARR_GetElements() to return a pointer
to the array instance itself when the data has not been allocated yet.
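A sketch of the idea, with an assumed instance layout (the structure
and field names here are illustrative, not the actual ones):

  #include <assert.h>
  #include <stddef.h>

  /* Illustrative instance layout */
  struct ArrayInstance {
    void *data;              /* element storage, NULL until first growth */
    unsigned int elem_size;
    unsigned int used;
    unsigned int allocated;
  };

  void *
  array_get_elements(struct ArrayInstance *array)
  {
    if (!array->data) {
      /* Nothing allocated yet: return the instance itself so that calls
         like memcpy(dst, ARR_GetElements(a), 0) still get a valid pointer */
      assert(array->used == 0);
      return array;
    }
    return array->data;
  }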
It's not expected that we will work with such large arrays anytime soon,
but better safe than sorry.
Also, limit the number of elements to 2^31-1 to prevent an infinite loop
in the calculation of the number of allocated elements.
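A sketch of why the limit matters, assuming a doubling growth strategy
for the allocated count (the constant and function names here are
hypothetical): without the upper bound, doubling a 32-bit unsigned
count could wrap to zero and the loop would never reach the requested
size.

  #include <assert.h>

  #define MAX_ELEMENTS 0x7fffffffU  /* 2^31-1, assumed limit */

  /* Grow the allocation by doubling until it covers min_size.  If
     min_size could exceed 2^31-1, 2 * allocated could wrap around to 0
     in a 32-bit unsigned int and the loop would never terminate. */
  static unsigned int
  grow_allocation(unsigned int allocated, unsigned int min_size)
  {
    assert(min_size <= MAX_ELEMENTS);

    while (allocated < min_size)
      allocated = allocated ? 2 * allocated : 1;

    return allocated;
  }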