Hi everybody,
There has been some talk recently in a smaller group about where coreboot needs to improve the most in public perception, and how to get there.
Consensus has been that we're doing a pretty bad job at promoting all the hardware that we support in each coreboot version.
There's board-status which I started _years_ ago in the hope that somebody picks up the slack, but everybody has been busy, myself included. By now the collected information of the last 7.5 years is compiled into a 12MB HTML file that takes ages to render on moderate hardware (beware: https://www.coreboot.org/status/board-status.html), and the process to collect that data is mostly manual using pretty poor tooling. (Most of the links on that page don't even work anymore (which I'll fix) due to gitweb/cgit/gitiles changes on review.coreboot.org (and I only noticed by chance now).)
Meanwhile, there are several parties that boot test the hardware they care about regularly, with (often internal) information about how well coreboot does there.
We can't expect all those existing systems to converge into a single testing framework, but we could make it a single test result reporting framework.
To this end, I invite people interested in that topic to chime in on this email thread so that we can discuss what we could do to provide a common place with information about which coreboot versions are bootable on which boards in a way that makes sense for everybody: users who are interested in such data as well as testers that already collect it but have no way to publish it.
Thanks, Patrick
On 6/16/21 8:46 PM, Patrick Georgi via coreboot wrote:
Hi everybody,
Hi Patrick, thank you for bringing this topic to ml.
There has been some talk recently in a smaller group about where coreboot needs to improve the most in public perception, and how to get there.
Consensus has been that we're doing a pretty bad job at promoting all the hardware that we support in each coreboot version.
There's board-status which I started _years_ ago in the hope that somebody picks up the slack, but everybody has been busy, myself included. By now the collected information of the last 7.5 years is compiled into a 12MB HTML file that takes ages to render on moderate hardware (beware: https://www.coreboot.org/status/board-status.html), and the process to collect that data is mostly manual using pretty poor tooling. (Most of the links on that page don't even work anymore (which I'll fix) due to gitweb/cgit/gitiles changes on review.coreboot.org (and I only noticed by chance now).)
Meanwhile, there are several parties that boot test the hardware they care about regularly, with (often internal) information about how well coreboot does there.
One of those parties is 3mdeb: we publish 150 regression test results for v4.0.x and mainline on 6 platforms: https://docs.google.com/spreadsheets/d/1_uRhVo9eYeZONnelymonYp444zYHT_Q_qmJE...
We can't expect all those existing systems to converge into a single testing framework, but we could make it a single test result reporting framework.
This topic was discussed many times at various conferences and on OSFW Slack. I believe contest aims to be that framework; please correct me if I misinterpreted something from OSFC'20.
What is missing, from the perspective of sending test reports from 3mdeb's validation infrastructure, is a REST API definition that can receive the required board-status data.
For basic support, maybe a Qubes OS-like HCL would be good enough? https://www.qubes-os.org/doc/hcl/
Please note that Qubes does not force people to make git commits, as the old board-status does, which lowers the barrier for reporting. https://github.com/QubesOS/qubes-hcl
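To sketch what such a REST API could accept, here is a minimal, hypothetical report payload in Python. The field names and the example endpoint URL are assumptions for illustration, not an existing coreboot interface:

```python
import json

# Hypothetical board-status report payload; every field name here is
# illustrative, not part of any existing coreboot API.
def make_report(vendor, board, coreboot_version, result, details=None):
    return {
        "vendor": vendor,
        "board": board,
        "coreboot_version": coreboot_version,
        "result": result,          # e.g. "pass" or "fail"
        "details": details or {},  # free-form extras: payload, booted OS, ...
    }

report = make_report("pcengines", "apu2", "4.14", "pass",
                     {"payload": "SeaBIOS", "os": "Debian 11"})

# A tester's client could then POST this as JSON, e.g. with the
# `requests` library (the endpoint URL is made up):
#   requests.post("https://status.example.org/api/v1/reports", json=report)
print(json.dumps(report, sort_keys=True))
```

The point is that anything a validation lab already collects could be mapped onto such a flat record without requiring git commits or manual steps.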
To this end, I invite people interested in that topic to chime in on this email thread so that we can discuss what we could do to provide a common place with information about which coreboot versions are bootable on which boards in a way that makes sense for everybody: users who are interested in such data as well as testers that already collect it but have no way to publish it.
As mentioned at the coreboot leadership meeting, a validation system will most probably need tweaks/modifications/improvements to the build system.
Some future ideas that may come from the build system:
1. coreboot.org could host build results for a given defconfig and make them accessible to regular users. Of course this would be vanilla coreboot, since we agreed that in most cases production builds are different from what we have on coreboot.org. This lowers the barrier, since regular users do not have to compile coreboot themselves. Having confirmed, working binaries for a given configuration would be a huge win and a step towards "stable releases", which I advertised in other discussions.
2. Firmware binaries could be delivered through the fwupd/LVFS infrastructure. This would largely help to reach more end users, since with a simple switch in fwupd they would be able to seamlessly deliver alternative firmware to their devices, including updates.
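As a rough illustration of idea 1, a hosted-builds index could be as simple as one record per defconfig; all paths and fields below are made up for the sketch, not an actual coreboot.org layout:

```python
# Toy index of hosted vanilla builds, one entry per defconfig.
# Every path, version and field name here is invented for illustration.
builds = {
    "config.lenovo_x230": {
        "coreboot_version": "4.14",
        "artifact": "builds/4.14/lenovo_x230/coreboot.rom",
        "sha256": "<digest of the image>",
        "boot_tested": True,
    },
}

# A user would pick the entry for their board and verify the digest
# before flashing; a tool like fwupd could consume the same metadata.
entry = builds["config.lenovo_x230"]
print(entry["artifact"])
```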
Best Regards,
Hi Patrick, thanks for picking this up. From a high-level perspective this is very simple. I'll try to sum it up, to help find an existing framework (something I have attempted several times, but failed). When ignoring the distribution of binaries or blobs and focusing only on test reports, you have something similar to a review application.
We need:
a) User management and authentication.
b) Users being able to add new "products". In our case that's a mainboard variant with a specific hardware configuration (memory, hard disk, PCIe cards, ...). Every product has a custom set of properties you want to evaluate and for which you can send in a test report. That could be:
- boots OS x
- device Y detected and properly configured
- register Z locked and secure boot is possible
c) For every product in the database, authenticated users could then push status reports (reviews) which supply a test result for the properties defined earlier. The status report would be submitted for a specific coreboot version and build config. You could 'add', 'list', 'sort', 'filter', 'delete' reports for "products" as well.
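The product/report split described above can be sketched as a minimal data model; the class and field names are illustrative assumptions, not an existing schema:

```python
from dataclasses import dataclass, field

# Minimal sketch of the "product" / "status report" model described in
# the email above. All names are illustrative, not an existing schema.

@dataclass
class Product:
    # A mainboard variant with a specific hardware configuration.
    board: str
    config: dict                  # memory, hard disk, PCIe cards, ...
    properties: list = field(default_factory=list)  # e.g. "boots OS x"

@dataclass
class StatusReport:
    reporter: str                 # authenticated user
    coreboot_version: str
    build_config: str
    results: dict                 # property name -> True/False

x230 = Product("lenovo/x230", {"memory": "16GB"},
               ["boots Debian", "TPM measured boot"])
report = StatusReport("jason", "4.14", "defconfig.x230",
                      {"boots Debian": True, "TPM measured boot": True})

# 'list'/'filter' over reports then reduces to simple queries, e.g.
passing = [p for p, ok in report.results.items() if ok]
print(passing)  # → ['boots Debian', 'TPM measured boot']
```

Any review-style web application that supports per-product custom fields could carry this model; the hard part is agreeing on the property vocabulary, not the storage.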
Regards, Patrick Rudolph B.Sc. Electrical Engineering System Firmware Developer
9elements GmbH, Kortumstraße 19-21, 44787 Bochum, Germany Email: patrick.rudolph@9elements.com Phone: +49 234 68 94 188
Sitz der Gesellschaft: Bochum Handelsregister: Amtsgericht Bochum, HRB 17519 Geschäftsführung: Sebastian Deutsch, Eray Basar
Datenschutzhinweise nach Art. 13 DSGVO
On 6/24/21 8:33 AM, Patrick Rudolph wrote:
Hi Patrick,
Hi Patrick,
thanks for picking this up.
Agreed. To give a wider perspective: this topic was discussed many times. The key problem is that no one drives it. I'm not sure whether there would be any use in having a website that at least gathers all those discussions, so we don't repeat the same things again and again.
From a high-level perspective this is very simple. I'll try to sum it up, to help find an existing framework (something I have attempted several times, but failed). When ignoring the distribution of binaries or blobs and focusing only on test reports, you have something similar to a review application.
We need:
a) User management and authentication.
b) Users being able to add new "products". In our case that's a mainboard variant with a specific hardware configuration (memory, hard disk, PCIe cards, ...). Every product has a custom set of properties you want to evaluate and for which you can send in a test report. That could be:
- boots OS x
- device Y detected and properly configured
- register Z locked and secure boot is possible
I already gave Qubes OS example during coreboot leadership: https://www.qubes-os.org/hcl/ https://www.qubes-os.org/doc/how-to-use-the-hcl/ https://github.com/QubesOS/qubes-hcl https://github.com/QubesOS/qubes-core-admin/blob/master/qvm-tools/qubes-hcl-...
IMO account creation, SSH/GPG key configuration, git, etc. are blockers for regular users who care about open-source firmware and can limit the number of reports we will get. Probably that is why Qubes OS does not require such additional overhead. But maybe there are reasons why we would want to limit the number of reports.
Please note I'm not proposing the Qubes OS approach as the ultimate solution for coreboot, but rather showing how other OSS projects solved the same problem. Of course we may need something more sophisticated, but a more sophisticated solution needs more resources.
Best Regards,
Hi Patrick,
I'm a long time coreboot user on a Lenovo X230 and T430.
On 6/16/21 1:46 PM, Patrick Georgi pgeorgi@google.com wrote:
There's board-status which I started _years_ ago in the hope that somebody picks up the slack, but everybody has been busy, myself included. By now the collected information of the last 7.5 years is compiled into a 12MB HTML file that takes ages to render on moderate hardware (beware: https://www.coreboot.org/status/board-status.html), and the process to collect that data is mostly manual using pretty poor tooling. (Most of the links on that page don't even work anymore (which I'll fix) due to gitweb/cgit/gitiles changes on review.coreboot.org (and I only noticed by chance now).)
First of all, from the perspective of a user thank you very much for creating that board status list! Lots of good info on that page. I refer to it fairly regularly as it's helpful to see what versions of coreboot have worked for others and how my coreboot config settings and kernel log compare to what others have used / observed. I regret to admit that despite my intentions to contribute reports to that page I've yet to do so...
To this end, I invite people interested in that topic to chime in on this email thread so that we can discuss what we could do to provide a common place with information about which coreboot versions are bootable on which boards in a way that makes sense for everybody: users who are interested in such data as well as testers that already collect it but have no way to publish it.
Gerrit for whatever reason was and still is a bit foreign for me. I'm fairly comfortable working with gitea, gitlab, github, etc. with pull requests. I'm sure it would just take a bit for me to get familiar with the login and workflow, but I haven't done it yet.
It's probably stating the obvious, but with respect to the large single html page, breaking it up into separate pages by category (laptop, workstation, servers, etc.) would make things load quicker. Otherwise I thought the general table format with timeline of submissions further below was pretty good. One minor thing I notice is that the table is so big that I end up losing track of the column headers and try to remember what each column represents as I'm scrolling through.
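The split-by-category suggestion above amounts to a small grouping step in the page generator; the board names and categories below are made-up sample data:

```python
from collections import defaultdict

# Toy sketch: group board entries by device category so each category
# can be rendered as its own, smaller page. The data is made up.
boards = [
    ("lenovo/x230", "laptop"),
    ("lenovo/t430", "laptop"),
    ("pcengines/apu2", "server"),
]

pages = defaultdict(list)
for board, category in boards:
    pages[category].append(board)

# Each key would become its own HTML page with a repeated header row,
# which also addresses losing track of the column headers on scroll.
for category, entries in sorted(pages.items()):
    print(f"{category}.html: {len(entries)} boards")
```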
Thanks again, Jason