Patch Set 1: Code-Review-1
Dug a little into the chipset support for historical Intel in the tree: older ICHs implement the bit as read/write but document that it is ignored anyway, because everything goes to PCI. So that should be fine.

The 440BX doesn't seem to know the bit, though :-/ So this can't go in as is. I'm open to suggestions. What should we do if 16-bit decode isn't possible? Restrict all IO to 10 bits? That could brick things... Hmmmm, just warn about it instead of skipping the device?

I really don't like reserving the VGA resources 64 times, but we could do that.
The code in device.c seems to attempt to avoid the VGA IO and "legacy" PCI IO windows when allocating (dynamic) resources. I am not convinced that code always works, and those windows appear to be unnecessary on platforms where we can program CTL_VGA=1, CTL_VGA16=1 and CTL_(NO)_ISA=0 in PCI_BRIDGE_CONTROL.
I think we could first throw errors if assigned resources overlap any of those aliased (and enabled) IO address ranges. It should be easy to (manually) adjust the conflicting static resources when they appear.
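To illustrate the check suggested above: with only 10 bits decoded, the legacy VGA IO range 0x3b0-0x3df repeats at every 1 KiB throughout the 64 KiB IO space, so an assigned IO resource conflicts if it touches any of those 64 aliases. A minimal sketch (the helper name is hypothetical, not coreboot code):

```c
#include <assert.h>
#include <stdint.h>

/* Legacy VGA IO range (MDA/CGA/EGA registers plus DAC/palette). */
#define VGA_IO_BASE 0x3b0
#define VGA_IO_END  0x3df

/*
 * With only 10 address bits decoded, the VGA range aliases every
 * 1 KiB across the 64 KiB IO space: 0x3b0-0x3df, 0x7b0-0x7df, ...
 * Return 1 if [base, end] (inclusive) hits any of those aliases.
 */
static int overlaps_vga_alias(uint16_t base, uint16_t end)
{
	uint16_t lo = base & 0x3ff;	/* offset within the 1 KiB window */
	uint16_t hi = end & 0x3ff;

	if (end - base >= 0x400)	/* spans a whole 1 KiB window */
		return 1;
	if (lo <= hi)			/* stays inside one window */
		return lo <= VGA_IO_END && hi >= VGA_IO_BASE;
	/* wraps across a window boundary: check both fragments */
	return lo <= VGA_IO_END || hi >= VGA_IO_BASE;
}
```

Throwing an error from the allocator when this returns 1 (while VGA decode is enabled and 16-bit decode is not) would flag exactly the conflicting static resources.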
3 comments:
So... we are shifting the start of the resource upwards here. But nothing guarantees that the end of the resource does not hit the range of aliased decodes?
Patch Set #1, Line 794: /* VGA is first add-on card or the only onboard VGA. */
I thought it would be the last matching device that repeated dev_find_class() calls return? Just assuming here that it would be the one with the highest bus number.
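For clarity on the point above, here is a toy stand-in for the device-list walk (the `struct device`, list contents, and helper names are all hypothetical; coreboot's real dev_find_class() walks the global device list): calling it repeatedly until NULL yields the last matching device in list order, which need not be the one with the highest bus number.

```c
#include <assert.h>
#include <stddef.h>

#define PCI_CLASS_DISPLAY_VGA 0x0300

/* Toy device node; real coreboot devices carry much more state. */
struct device {
	unsigned int class;
	unsigned int bus;	/* bus number, for the "highest bus?" question */
	struct device *next;
};

/* Mimics dev_find_class(class, from): return the next device of the
 * given class after 'from', scanning the list starting at 'all'. */
static struct device *find_class(unsigned int class, struct device *from,
				 struct device *all)
{
	struct device *dev = from ? from->next : all;
	for (; dev; dev = dev->next)
		if (dev->class == class)
			return dev;
	return NULL;
}

/* Looping until NULL keeps the LAST match in list order. */
static struct device *find_last_class(unsigned int class, struct device *all)
{
	struct device *dev = NULL, *last = NULL;
	while ((dev = find_class(class, dev, all)))
		last = dev;
	return last;
}

/* Demo list: a bridge, then VGA on bus 1, then VGA on bus 2. */
static unsigned int demo_last_vga_bus(void)
{
	static struct device d2 = { PCI_CLASS_DISPLAY_VGA, 2, NULL };
	static struct device d1 = { PCI_CLASS_DISPLAY_VGA, 1, &d2 };
	static struct device d0 = { 0x0604, 0, &d1 };
	return find_last_class(PCI_CLASS_DISPLAY_VGA, &d0)->bus;
}
```

Note the "last in list order" result only equals "highest bus number" if the device list happens to be sorted that way.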
Patch Set #1, Line 807: bridge_ctrl = pci_read_config16(bus->dev, PCI_BRIDGE_CONTROL);
The specs for these bits match, so it is fine (see CardBus spec 4.5.2. […]
The CardBus spec I found, dated 2001 and claiming release 8.0, documents bit 4 as reserved and returning zero, with no reference to 16-bit VGA decode.
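For reference, a sketch of the Bridge Control bits being discussed, per the PCI-to-PCI Bridge Architecture Specification rev 1.2 (the 16-bit VGA decode bit was only added in rev 1.2, which could explain an older CardBus document listing bit 4 as reserved). The helper function is hypothetical, not code from this change:

```c
#include <assert.h>
#include <stdint.h>

/* PCI_BRIDGE_CONTROL bits (PCI-to-PCI Bridge Arch. Spec rev 1.2). */
#define PCI_BRIDGE_CTL_ISA	(1 << 2)  /* block ISA IO aliases per 1 KiB */
#define PCI_BRIDGE_CTL_VGA	(1 << 3)  /* forward VGA accesses */
#define PCI_BRIDGE_CTL_VGA16	(1 << 4)  /* decode all 16 VGA IO address bits */

/* Hypothetical helper: does this bridge positively decode full 16-bit
 * VGA IO addresses, i.e. without 10-bit aliasing? VGA16 only has
 * meaning while VGA forwarding itself is enabled. */
static int bridge_vga16_enabled(uint16_t bridge_ctrl)
{
	const uint16_t mask = PCI_BRIDGE_CTL_VGA | PCI_BRIDGE_CTL_VGA16;
	return (bridge_ctrl & mask) == mask;
}
```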
To view, visit change 35516.