In the early days of DIY DCC decoders, the preferred approach to the decoding logic was precisely metered, hand-written assembly code that sampled the DCC input. These decoders were based on the venerable PIC16C84 running at 4 MHz.
But with modern PICs running at 32 MHz and above, it is possible to write an interrupt-driven DCC decoder. That means no more intricately hand-coded assembly: you can write it in C without worrying too much about the generated code, within reason. So what is a good interrupt period for sampling the DCC input? OK, 22 us. But how accurate does that period have to be? Can we use the internal oscillator in a PIC? Yes, we can!
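To make the idea concrete, here is a minimal sketch of the kind of logic a 22 us timer interrupt can drive. It is plain C rather than code for any particular PIC: the dcc_sample() function, its thresholds, and the little test harness are illustrative assumptions, and the device-specific timer setup and pin read are omitted. The principle is simply to count how many 22 us samples the DCC input spends at one level before it flips. A nominal 58 us "1" half-bit spans two or three samples, while a "0" half-bit of roughly 95 us or more spans at least four, so a threshold of three samples separates them.

```c
/*
 * Sketch of interrupt-driven DCC half-bit classification at a 22 us
 * sample period. dcc_sample() is what a real decoder would call from
 * its timer interrupt with the current DCC input level; the main()
 * below just feeds it a synthetic waveform so the logic can be
 * checked on a PC. Names and thresholds are illustrative, not from
 * any particular decoder.
 */
#include <stdint.h>
#include <stdio.h>

#define SAMPLE_US        22u  /* assumed interrupt period                  */
#define ONE_MAX_SAMPLES   3u  /* <= 3 samples (~64 us or less) -> "1" half */

typedef enum { HALF_BIT_NONE, HALF_BIT_ONE, HALF_BIT_ZERO } half_bit_t;

/* Call once per timer interrupt with the current DCC input level (0 or 1).
 * Returns the half-bit that just ended when an edge is seen, else NONE. */
static half_bit_t dcc_sample(uint8_t level)
{
    static uint8_t  last_level = 0;
    static uint16_t count      = 0;

    if (level != last_level) {
        /* Edge: classify the half-bit that just finished (if any). */
        half_bit_t result = HALF_BIT_NONE;
        if (count > 0)
            result = (count <= ONE_MAX_SAMPLES) ? HALF_BIT_ONE
                                                : HALF_BIT_ZERO;
        last_level = level;
        count      = 1;
        return result;
    }

    count++;                       /* still inside the same half-bit */
    return HALF_BIT_NONE;
}

/* Hold 'level' for 'duration_us', sampling every 22 us as an ISR would. */
static void feed(uint8_t level, uint16_t duration_us)
{
    for (uint16_t t = 0; t < duration_us; t += SAMPLE_US) {
        half_bit_t hb = dcc_sample(level);
        if (hb == HALF_BIT_ONE)  printf("1");
        if (hb == HALF_BIT_ZERO) printf("0");
    }
}

int main(void)
{
    /* Synthetic stream: two "1" bits (58 us halves), one "0" bit (100 us). */
    feed(1, 58);  feed(0, 58);
    feed(1, 58);  feed(0, 58);
    feed(1, 100); feed(0, 100);
    feed(1, 58);  /* one extra edge so the last half-bit gets classified */
    printf("\n");
    return 0;
}
```

On real hardware the ISR would read the DCC input pin and hand the level to dcc_sample(), then a layer above would pair the half-bits into bits and assemble packets; the counting scheme is deliberately tolerant of the sample period landing anywhere within a half-bit, which is what makes the 22 us figure workable in the first place.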