[coreboot] native video init question

Nico Huber nico.h at gmx.de
Wed Nov 16 00:46:45 CET 2016


On 14.11.2016 06:43, Charlotte Plusplus wrote:
> Hello
> Here is the current status of my W520:
>  - native video init gives a garbled image (picture available upon request
> lol). it may be due to the resolution of the screen being hardcoded
> somewhere, or more likely me using the wrong information, since the W520
> uses 1920x1080
>  - non native video works fine

I've seen a garbled image, too, lately, when I built with native
raminit by chance but with completely different gfx init code
(3rdparty/libgfxinit). So it might still be some problem in the
raminit. This was also on an Ivy Bridge system with four DIMMs,
btw. I suspect the native raminit just wasn't tested in that
configuration.
>  - with native ram init, the memory tends to be unstable unless great care
> is used (basically blindly increasing SPD latencies and setting
> max_mem_clock_mhz=666)

>  - with native ram init, even with that sometimes the boots stop on:
> "discover timC write:
> t123: 1048, 6000, 7620"

>  - with non native ram init, there is no video

>  - various dmaerr unless iommu is disabled

>  - the USB3 controller is not showing in lspci, even though I am quite
> sure it is on the right PCIe (likely due to the RCBA32 value I am using)

>  - the modem codec is not detected by snd-hda-intel, even when the probe
> is forced

>  - some other unknown things may not work.
> At the moment, I am trying to advance with native video first as it is a
> low hanging fruit: I'm quite sure the gfx.did or LVDS settings in my
> devicetree must be wrong.
> How can I guess the right ones after booting with the videorom?

Well, I don't see any setting that could really break something. The
code might just be buggy. I'll go through the settings anyway, taking
src/mainboard/lenovo/t520/devicetree.cb as an example:

> # IGD Displays
> register "gfx.ndid" = "3"
> register "gfx.did" = "{ 0x80000100, 0x80000240, 0x80000410, 0x80000410, 0x00000005 }"

That's about ACPI, let's not care (see the last appendix of the ACPI
spec if you want to have a look).

> # Enable DisplayPort Hotplug with 6ms pulse
> register "gpu_dp_d_hotplug" = "0x06"

This is not related to native gfx init; I wonder why the firmware
should set it at all.

> # Enable Panel as LVDS and configure power delays
> register "gpu_panel_port_select" = "0"                  # LVDS
> register "gpu_panel_power_cycle_delay" = "5"
> register "gpu_panel_power_up_delay" = "300"             # T1+T2: 30ms
> register "gpu_panel_power_down_delay" = "300"           # T5+T6: 30ms
> register "gpu_panel_power_backlight_on_delay" = "2000"  # T3: 200ms
> register "gpu_panel_power_backlight_off_delay" = "2000" # T4: 200ms

Those are real register settings, you can dump the whole GMA MMIO
space with `inteltool -f` (hoping that your system doesn't hang). The
registers are described in [1, chapter 2.4].

> register "gfx.use_spread_spectrum_clock" = "1"

This is set in DPLL_CTL, bits 15:13 (there are two such registers; just
boot with only the internal display and read the first one, 0xc6014 [1]).

> register "gfx.link_frequency_270_mhz" = "1"

This setting is only documented for Nehalem ([3, 2.10.1], the lowest
8 bits). Linux always assumes 270 MHz for Sandy/Ivy Bridge (which
contradicts our setting for the MacBookAir4,2, but that might be wrong).

> register "gpu_cpu_backlight" = "0x1155"

That one is just wrong, see [2, chapter 3.9.2]: it must not be higher
than the upper 16 bits of the next register.

> register "gpu_pch_backlight" = "0x06100610"

The upper 16 bits give a divisor for the backlight PWM [1, 2.5.2]. The
lower 16 bits give the duty cycle, iff bit 30 of register 0xc8250 is
set, which we do not set (i.e. the gpu_cpu_backlight value above is in
charge). So the lower 16 bits can just be zero.

Hope that helps,
Nico



