memory in the last arena-set (1/2 on average). In trade, we get
back the 1st slot in each arena (ie 1.7% of a CV-arena, less for
smaller types). The recovery of the wasted space allows use of
- small arenas for large, rare body types,
+ small arenas for large, rare body types, by changing array* fields
+ in body_details_by_type[] below.
*/
struct arena_desc {
char *arena; /* the raw storage, allocated aligned */
struct arena_set;
/* Get the maximum number of elements in set[] such that struct arena_set
- will fit within PERL_ARENA_SIZE, which is probabably just under 4K, and
+ will fit within PERL_ARENA_SIZE, which is probably just under 4K, and
therefore likely to be 1 aligned memory page. */
#define ARENAS_PER_SET ((PERL_ARENA_SIZE - sizeof(struct arena_set*) \
case OP_ENTERSUB:
case OP_GOTO:
- /* XXX tmp hack: these to may call an XS sub, and currently
- XS subs don't have a SUB entry on the context stack, so CV nad
- pad deteminaion goes wrong, and BAD things happen. So, just
- don't try to detemine the value under those circumanstances.
+ /* XXX tmp hack: these two may call an XS sub, and currently
+ XS subs don't have a SUB entry on the context stack, so CV and
+ pad determination goes wrong, and BAD things happen. So, just
+ don't try to determine the value under those circumstances.
-	   Need a better fix at dome point. DAPM 11/2007 */
+	   Need a better fix at some point. DAPM 11/2007 */
break;