Hi there,
This seems pretty simple.
On 02/04/2016 01:35 AM, Patrick Georgi via coreboot wrote:
> Hi all,
> during the review of some commits that are in the process of being upstreamed from Chrome OS, people noticed that chipset drivers like to define their own TRUE/FALSE macros (sometimes prefixed, too), and I have seen a bunch of #define BIT{0-31} ... as well, because that seems to be the house rule in some firmware communities.
> I think we should seek uniformity here: decide on some style, recommend it, clean up the tree to match, and help people stay consistent through lint tests. What I don't know, however, is what that style should look like.
> So, two topics:
> - TRUE/FALSE
> Do we want such defines? If so, TRUE/FALSE, or true/false, or True/False, or ...?
Standardize on stdbool.h, bool, true, false. It's the logical choice.
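A minimal sketch of what that looks like in a driver (device_is_enabled and the register layout are invented for illustration):

  #include <stdbool.h>
  #include <stdint.h>

  /* Hypothetical example; bool/true/false come from stdbool.h,
     so no private TRUE/FALSE defines are needed. */
  static bool device_is_enabled(uint32_t reg)
  {
          return (reg & 0x1) != 0;
  }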
> - BIT16 vs BIT(16) vs (1 << 16) vs 0x10000
> I don't think it makes sense to go for a single one of these (0x3ff is certainly more readable than BIT9 | BIT8 | BIT7 | BIT6 | BIT5 | BIT4 | BIT3 | BIT2 | BIT1 | BIT0), but I doubt we need both BIT16 and BIT(16).
BIT16, no. Who wants 64 defines? Seriously, this sort of stuff raises blood pressure.
BIT(16), sure. Actually, I used this trick a while back in the Allwinner port: BIT(16) meant "just a bit", whereas (1 << 16) meant an enum value (usually paired with 2 << 16, 3 << 16, and so on).
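Roughly, that convention looked like this (a sketch; the CTRL_* names are made up for the example):

  #define BIT(x)                  (1u << (x))

  #define CTRL_START              BIT(0)    /* a lone flag bit */

  /* A two-bit field at bits 17:16; enumerated values use plain
     shifts rather than BIT() to signal "field, not flag". */
  #define CTRL_MODE_RX            (1 << 16)
  #define CTRL_MODE_TX            (2 << 16)
  #define CTRL_MODE_LOOPBACK      (3 << 16)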
0x10000 also fine. If it's in a list of defines, why not?
  #define RON   0x8000
  #define JOHN  0x10000
  #define MARY  0x20000
We all know the Chromium guys like to copy-paste code from all over the interwebs, and quite frankly, why make their lives harder with harsh guidelines? These are small inconsistencies that I don't think bother anyone too much.
Alex