On 02/08/2016 11:29 AM, Julius Werner wrote:
> On 08.02.2016 12:10, Patrick Georgi via coreboot wrote:
>> 2016-02-04 10:35 GMT+01:00 Patrick Georgi <pgeorgi@google.com>:
>>> During the review of some commits that are in the process of being upstreamed from Chrome OS, people noticed that chipset drivers like to define their own TRUE/FALSE defines (sometimes prefixed, too), and I have seen a bunch of #define BIT{0-31} ... as well, because that seems to be the house rule in some firmware communities.
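For illustration, the kind of per-driver duplication being described might look roughly like this (the file paths and prefixes below are made up, not taken from the tree):

  /* src/soc/vendor_a/include/soc_defs.h (hypothetical) */
  #define VENDORA_TRUE   1
  #define VENDORA_FALSE  0
  #define BIT0  0x00000001
  #define BIT1  0x00000002
  /* ... */
  #define BIT31 0x80000000

  /* src/drivers/vendor_b/chip.h (hypothetical) -- the same thing, defined again */
  #define TRUE  1
  #define FALSE 0
  #define BIT0  (1 << 0)
  #define BIT1  (1 << 1)
  /* ... */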
>> Seems like for the BIT defines, all variants are popular. Any objections to moving them to src/include instead of having various copies across the tree?
>> How about something like src/include/please_dont_use.h?
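A shared header along these lines might look like the sketch below; the file name is Patrick's joke suggestion, and the contents are only an assumption about what such a header would collect:

  /* src/include/please_dont_use.h -- sketch only */
  /* Discouraged constructs, kept in one place so that imported code
   * builds without every driver carrying its own copy. */
  #ifndef TRUE
  #define TRUE  1
  #endif
  #ifndef FALSE
  #define FALSE 0
  #endif
  #define BIT(x)  (1u << (x))
  #define BIT0    BIT(0)
  #define BIT1    BIT(1)
  /* ... */
  #define BIT31   BIT(31)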
> I don't know about that, but I'd have no objection to a comment in the file itself stating that these are vile constructs.
> If we agree on what not to use, why not just fix it completely? Coccinelle is great for that sort of thing. (For the record, I don't mind BIT(x) too much, but I also prefer (1 << x), since it's more consistent with multi-bit fields like (5 << x).)
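To make that consistency argument concrete, here is a small register-layout sketch; the register and field names are invented:

  /* Hypothetical control register bits */
  #define CTRL_ENABLE      (1 << 0)   /* single-bit flag */
  #define CTRL_RESET       (1 << 1)
  #define CTRL_MODE_MASK   (7 << 4)   /* 3-bit field uses the same shift style */
  #define CTRL_MODE_FAST   (5 << 4)

  /* With BIT() the single bits read fine, but multi-bit fields still
   * need the shift form, so the two styles end up mixed: */
  #define CTRL_ENABLE_ALT  BIT(0)
  #define CTRL_MODE_MASK2  (7 << 4)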
Other than the BITxx macros, really, what's the purpose of this? Sure, standardization sounds great, but can anyone point to cases where _not_ standardizing has actually cost us productivity? As I've said before, we import code from other codebases, which are often not even self-consistent. Being forced to convert one style of bit macro to another is itself a loss of productivity.
Just from that, it seems that trying to "standardize" is a net loss. Pretty code is useless when it's slow and clumsy to write.
Alex