[OpenBIOS] toke core dumps

Prasanna Kumar pras_iitb at yahoo.com
Mon Oct 3 05:16:43 CEST 2005

Hello All,

When I try to tokenize a large FCode source file, the
tokenizer crashes with a segmentation fault.

Debug messages:

tst.fth:6763: debug: tokenizing control word 'endof'
tst.fth:6764: debug: read token 'endcase', length=7
tst.fth:6764: debug: matched internal opcode 0x0013
tst.fth:6764: debug: tokenizing control word 'endcase'
tst.fth:6764: debug: endcase offset 0x8401
tst.fth:6764: debug: endcase offset 0xffff83f1
Segmentation fault (core dumped)

When I looked at the code, the offending function was
receive_offset() in emit.c:

s16 receive_offset(void)
{
        s16 offs=0;

        if (offs16) {
                offs = ((*opc)<<8) | (*(opc+1));  /* 16-bit offset */
        } else {
                offs = *opc;                      /* 8-bit offset */
        }
        return offs;
}

Since offs is declared s16, an offset with the msb set
(value >= 0x8000, e.g. 0x8yyy) is negative, and it
sign-extends to 0xffff8yyy when widened, which is exactly
what the debug output above shows.

Changing offs to u16 and making receive_offset() return a
u16 solves the problem. The prototype for receive_offset in
emit.h needs to be changed as well.

Modified code:

u16 receive_offset(void)     <---- this line changed
{
        u16 offs=0;          <---- this line changed

        if (offs16) {
                offs = ((*opc)<<8) | (*(opc+1));
        } else {
                offs = *opc;
        }
        return offs;
}

Steps to reproduce the problem very simply:
- Create an FCode source file of around 7000 lines
- Use the following simple test code:

\ temp1/temp2 are assumed to be declared as values
0 value temp1
0 value temp2

: test ( -- )
   1 to temp1
   2 to temp2
   temp1 case
     1 of
       ." One" cr
     endof
     2 of temp2
       ." Two" cr
     endof
   endcase
;

- Run tokenizer on the file. toke will core dump.


