08-17-2013, 03:24 AM   #9
nickdigger
EcoModding Apprentice
 
Quote:
Originally Posted by t vago
I can. My code does not use any sort of loop-based delay2() function, nor does it use any sort of microSeconds() function. When I attempted last night to completely remove the buffered feature, I had to add back in a delay2() and a microSeconds() and a microSecondsLength() function. That is what caused the bloat to occur.
Mine uses delay_us (20 bytes) and delay_ms (48 bytes), which calls milliseconds (48 bytes). Now that I look at it, I will probably merge milliseconds() into delay_ms(), since it basically just returns the timer count and is only ever called by delay_ms(). (Edit: so far, no good. It's "so efficient" that the compiler wants to inline the new delay_ticks everywhere, costing me 300 extra bytes.)
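A minimal sketch of what that merge might look like, assuming an AVR target. Only delay_ticks is named above; timer_ticks and read_ticks are hypothetical names, not the actual MPGuino code. The noinline attribute is one way to keep avr-gcc from placing a copy of the helper at every call site, which is the kind of bloat the edit describes.

Code:
#include <stdint.h>
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t timer_ticks;  /* incremented by a timer overflow ISR (not shown) */

/* 16-bit reads are not atomic on 8-bit AVR, so mask interrupts around the read. */
static uint16_t read_ticks(void)
{
    uint8_t sreg = SREG;
    cli();
    uint16_t t = timer_ticks;
    SREG = sreg;
    return t;
}

/* Busy-wait for the given number of timer ticks.  Marking this noinline
   keeps a single copy in flash instead of one per call site. */
__attribute__((noinline)) void delay_ticks(uint16_t ticks)
{
    uint16_t start = read_ticks();
    while ((uint16_t)(read_ticks() - start) < ticks)
        ;  /* spin; unsigned subtraction handles counter wraparound */
}

Whether this nets out smaller depends on the number of call sites; with only one or two callers, letting the compiler inline can actually be cheaper.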

Quote:
I'm strongly tempted to use the timer2 overflow interrupt handler to handle processing the LCD and serial buffers.
I had assumed you were already doing something like that. It might be trivial to add another "event" chain to the handler for each output.
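A minimal sketch of that event-chain idea, assuming an ATmega target with a Timer2 overflow vector; the handler names and the table are hypothetical, not t vago's actual code. Each overflow walks a table of per-output handlers, so adding another output is just another table entry.

Code:
#include <stdint.h>
#include <avr/interrupt.h>

static void lcd_buffer_tick(void)    { /* drain one queued byte to the LCD */ }
static void serial_buffer_tick(void) { /* drain one queued byte to the UART */ }

typedef void (*event_fn)(void);

/* One entry per buffered output. */
static event_fn const event_chain[] = { lcd_buffer_tick, serial_buffer_tick };

ISR(TIMER2_OVF_vect)
{
    for (uint8_t i = 0; i < sizeof event_chain / sizeof event_chain[0]; i++)
        event_chain[i]();
}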

Last edited by nickdigger; 08-17-2013 at 04:30 AM.