Two very newbish questions.
If you reduce the delays and let the RGB example run long enough, it seems to enter a strange state.
I think it's due to this code:
int i = 0;
while (1)
{
    fade(COLORS[i % 3], dir);
    i++;
    dir = !dir;
}
I think the variable i eventually overflows and goes negative, so i % 3 becomes negative and COLORS is indexed out of bounds. I changed it to this and it seems to run longer:
int i = 0;
while (1)
{
    fade(COLORS[i], dir);
    if (++i > 2) i = 0;
    dir = !dir;
}
Does this seem correct?
Second question:
Why would you do this:
void DigisparkRGBDelay(int ms) {
    while (ms) {
        _delay_ms(1);
        ms--;
    }
}
Instead of this:
void DigisparkRGBDelay(int ms) {
    _delay_ms(ms);
}
Is there some side effect like blocking interrupts or something?
Thanks.