Recently, someone asked me to explain the SD card's CRC algorithm. I had forgotten a lot, so I couldn't tell him much, but it got me curious and I decided to document exactly how it works here on this blog post so I can refer to it later.
First of all, the official SD card documentation (section 4.5 as of the time of this post) briefly describes the CRC algorithm and gives a few examples. Unfortunately, its description reads as if the reader were a math major in a masters program at a prestigious university, rather than providing a simple algorithm that would make sense to a novice software developer :)
The Wikipedia article ( http://en.wikipedia.org/wiki/Cyclic_redundancy_check ) is a bit friendlier and gives a helpful example.
SD cards use two CRC algorithms: one is called CRC7, the other is called CRC16. I will explain CRC7 here. CRC16 works the same way, except that its "polynomial" is different, as is the amount of padding that the initial number must receive.
The polynomial for CRC7 is 0x89; the polynomial for CRC16 is 0x1021 which is based upon a standard called CRC-CCITT. More about that here.
The CRC7 algorithm works like this:
- Pretend that the data buffer you want to compute the CRC7 of is one really huge number, stored big endian (most significant byte first).
- Shift this number left by the number of bits the result will have, in this case 7. (This zero padding seems to be standard for most CRC algorithms.)
- Find the first bit (starting from the left) that is a 1 in the number and XOR it with the polynomial, where the polynomial is shifted left enough times that its highest bit lines up with the 1 that you found. (I will provide an example later.)
- Repeat the last step until all of the original bits (the ones you had before shifting left by 7) are 0's.
- The CRC result will be in the bottom 7 bits at this point.
Here is an example of how to get the CRC7 result that the SD card specs say you should get.