Last month when I started experimenting with OpenBIOS, I found the
documentation to be misleading about which files to call from GRUB.
Here's a patch to the README which I hope clarifies the matter:
Cygwin (http://cygwin.com) is a Linux shell/emulator that runs under
Windows. It uses standard gcc and ld.
We thought this would be the easiest way to get our developers going
with OpenBIOS as a development simulator...
Is anyone out there running OpenBIOS as an application under Windows (XP
or 2000)?
Hi guys, I'm trying to build OpenBIOS under Cygwin. I get the
following errors when I build it:
wenning@SHARPSTICK:~/src/openbios/openbios$ make
+ Entering libc
+ Entering kernel
+ Entering toke
+ Entering forth
+ Entering modules
+ Entering drivers
+ Entering fs
+ Entering grubfs
+ Entering arch/x86
BUILDING plain.image
ld: PE operations on non PE file.
make[1]: *** [plain.image] Error 1
make: *** [sub-arch/x86-all] Error 2
Any ideas?
Hi -
We're looking at using toke and detok in our development process.
Our current setup to tokenize device drivers uses a "script" similar to
this:
----- start kng_sr_script -----
\ define these words for the tokenizer
tokenizer[
0 constant ?SYSTEM-ROM
0 constant COMMENT-OUT
0 constant ?DEBUG-TX
0 constant ?DEBUG-RX
h# 0110 constant ibm-Code-Revision-Level
h# 17D5 constant ibm-VendorId
h# 5831 constant ibm-king-DeviceId
h# 020000 constant ClsCode
]tokenizer
fload kng_main.of
----- end kng_sr_script -----
Then in kng_main.of, we do:
----- start kng_main.of -----
....
tokenizer[ hex ibm-Code-Revision-Level decimal ]tokenizer SET-REV-LEVEL
tokenizer[ hex ibm-VendorId ibm-king-DeviceId ClsCode decimal
]tokenizer PCI-HEADER
FCODE-VERSION2
...
----- end kng_main.of -----
The version of toke I downloaded doesn't like this for two reasons:
1. Constant declarations in TOKENIZER[ ... ]TOKENIZER blocks emit
bytes/tokens to the output, so we get extra stuff before the PCI
header. I don't think it should do this; code inside tokenizer blocks
should not appear in the output FCode. Is this correct?
2. Previously declared constants aren't looked up/interpreted in
tokenizer mode. For example,
tokenizer[ hex ibm-Code-Revision-Level decimal ]tokenizer SET-REV-LEVEL
causes an "empty stack" error. I have to change it to
tokenizer[ 0110 ]tokenizer SET-REV-LEVEL
to make it work. Is this a bug?
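To make the behaviour we expect concrete, here is a toy model of the
scoping (a sketch invented purely for illustration, not toke's actual
implementation; the word ordering and hex handling are simplified):

```python
# Toy model of tokenizer[ ... ]tokenizer semantics as we understand them:
# definitions made in tokenizer mode live in the tokenizer's own
# dictionary, emit nothing to the FCode output, and can be referenced
# from later tokenizer[ ... ]tokenizer blocks.

def run(words):
    tok_consts = {}   # tokenizer-mode dictionary (e.g. ibm-VendorId)
    stack = []        # tokenizer evaluation stack
    fcode_out = []    # tokens actually emitted to the output
    in_tok = False
    it = iter(words)
    for w in it:
        if w == "tokenizer[":
            in_tok = True
        elif w == "]tokenizer":
            in_tok = False
        elif in_tok:
            if w == "constant":
                # Forth ordering: value is on the stack, name follows
                name = next(it)
                tok_consts[name] = stack.pop()
            elif w in tok_consts:
                # point 2: earlier constants resolve in tokenizer mode
                stack.append(tok_consts[w])
            else:
                # simplification: treat any other word as a hex literal
                stack.append(int(w, 16))
            # point 1: nothing is appended to fcode_out in tokenizer mode
        else:
            fcode_out.append(w)  # normal mode: token goes to the output
    return stack, fcode_out

# Model of "h# 0110 constant rev-level" followed later by
# "tokenizer[ rev-level ]tokenizer SET-REV-LEVEL":
stack, out = run(["tokenizer[", "0110", "constant", "rev-level", "]tokenizer",
                  "tokenizer[", "rev-level", "]tokenizer", "SET-REV-LEVEL"])
# stack is now [0x110]; out is ["SET-REV-LEVEL"]
```

In this model the constant defined in the first tokenizer block adds
nothing to the output stream, yet is still visible to the second block.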
I'm a little new to this; basically, I'm asking whether this behaviour
is what was intended, or whether we should attempt to fix it and submit
a patch to you.