Thanks David! That is a really good point about the write protection.
I've been thinking about this some more and can't help but feel some kind
of heuristic would be needed. For example, consider these three cases:
1. Only one 4KB chunk changed => Erase that 4KB chunk
2. All sixteen 4KB chunks in a 64KB chunk changed => Erase the 64KB chunk
3. 50% of the 4KB chunks in a 64KB chunk changed => Should the 8 individual
4KB chunks be erased, or the whole 64KB chunk?
The third case is where I think a heuristic would be needed. Thoughts?
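Just to make the third case concrete, here is a rough sketch of what such a
heuristic could look like. This is hypothetical code, not flashrom's actual
implementation: the names, the enum, and the threshold parameter are all
made up for illustration, and the right threshold value would depend on the
chip's measured 4KB vs 64KB erase timings.

```c
#include <stdbool.h>
#include <stddef.h>

/* A 64KB erase block contains sixteen 4KB sub-blocks. */
#define SUBBLOCKS_PER_64K 16

/* Hypothetical result type for the decision. */
typedef enum { ERASE_NONE, ERASE_4K_EACH, ERASE_64K_WHOLE } erase_plan;

/*
 * Decide how to erase one 64KB region, given which of its 4KB
 * sub-blocks differ from the new image.  `threshold` is the number of
 * dirty sub-blocks at which a single 64KB erase is assumed to be
 * cheaper than individual 4KB erases; it is a tunable, not a value
 * flashrom defines.
 */
static erase_plan plan_erase(const bool dirty[SUBBLOCKS_PER_64K],
			     unsigned threshold)
{
	unsigned ndirty = 0;
	for (size_t i = 0; i < SUBBLOCKS_PER_64K; i++)
		if (dirty[i])
			ndirty++;

	if (ndirty == 0)
		return ERASE_NONE;       /* nothing changed, skip entirely */
	if (ndirty >= threshold)
		return ERASE_64K_WHOLE;  /* enough changes: one big erase */
	return ERASE_4K_EACH;            /* few changes: spot-erase only */
}
```

With a threshold of 8, the 50% case would fall to the single 64KB erase,
while one or two changed chunks would still get individual 4KB erases.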
- Ryan
On Tue, Nov 19, 2019 at 10:00 PM David Hendricks <david.hendricks(a)gmail.com> wrote:
Hi Ryan,
What would be the best solution here? Would it make sense to submit this
change for this particular chip? I understand there may also be a
speed/wear tradeoff.
I am also interested in the more general case. Do larger erase sizes
tend to reprogram a chip faster? If so, why does flashrom not always use
the larger sizes?
Flashrom knows the current contents of the flash chip and the new
contents that will be programmed. So ideally it should
opportunistically use the largest block erase size for any given
chunk, whether it's 4KB or 64KB or the entire chip. The work just
needs to be done to implement that.
Of course things are never quite that simple, and we must also check
that it doesn't interfere with chip- or programmer-enforced write
protection. For example, the chip may have certain block ranges
protected, or, in the usual Intel platform case, the flash descriptor
may restrict regions and permissions (and the opmenu stuff).
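The write-protection check David mentions could be sketched roughly as
follows. Again, this is illustrative only: the range structure and function
names are invented here, and real code would populate the protected ranges
from the chip's protection registers or the Intel flash descriptor rather
than a static list.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical description of one write-protected address range. */
struct wp_range {
	uint32_t start;
	uint32_t len;
};

/* True if the half-open ranges [a_start, a_start+a_len) and
 * [b_start, b_start+b_len) share any address. */
static bool ranges_overlap(uint32_t a_start, uint32_t a_len,
			   uint32_t b_start, uint32_t b_len)
{
	return a_start < b_start + b_len && b_start < a_start + a_len;
}

/*
 * Before opportunistically picking a larger erase block, verify that it
 * does not touch any protected range; if it does, fall back to smaller
 * erase sizes that avoid the protected region.
 */
static bool erase_block_allowed(uint32_t addr, uint32_t len,
				const struct wp_range *prot, size_t nprot)
{
	for (size_t i = 0; i < nprot; i++)
		if (ranges_overlap(addr, len, prot[i].start, prot[i].len))
			return false;  /* would hit protected data */
	return true;
}
```

A 64KB erase that straddles a protected 4KB range would be rejected here
even if the heuristic otherwise preferred the larger block.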