Regarding my concern about TrueCrypt using a 512-bit hash algorithm to gather entropy for keying a 256-bit symmetric algorithm: I have found out why it is not insecure for them to do this. The issue was that I misunderstood what it means for a hash algorithm to evenly distribute the entropy of the input into the output. My original fear was that, since a hash function evenly distributes the entropy of the input over the output, feeding a 512-bit hash algorithm 256 bits of entropy would produce 512 output bits each containing half a bit of entropy, so taking 256 of those bits to key a 256-bit algorithm would leave the key with only 128 bits of entropy. That is not correct, and the problem was my understanding of "evenly distribute the entropy of the input into the output". In reality, if you feed a 512-bit hash algorithm 256 bits of entropy, each output bit carries a full bit of entropy, up to any selection of 256 bits.

This was not intuitive to me, but it makes sense if you think of it as follows. Imagine an algorithm that takes a fair coin flip (producing a 1 or a 0 with equal probability) and outputs the input bit followed by its complement. Feeding this algorithm a 1 produces 10, and feeding it a 0 produces 01. The entropy of the coin flip is 1 bit, because the result has 2^1 equally likely states (0 or 1). The entropy of the second output bit is also 1 bit, because it too can be a 1 or a 0 with equal probability. So after feeding this algorithm 1 bit of entropy, the output contains two bits, each with 1 bit of entropy: this is what is meant by evenly distributing the entropy of the input into the output. However, note that the output string as a whole also contains just 1 bit of entropy, because it is either 10 or 01, so 2^1 possible states. Even though each individual output bit has 1 bit of entropy, the entire output string also has only 1 bit of entropy.
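To make this concrete, here is a small sketch in Python of that toy bit-duplicating algorithm, with the Shannon entropy computed over both equally likely inputs. The function names (`expand`, `shannon_entropy`) are just my own choices for illustration:

```python
import math
from collections import Counter

def expand(bit):
    # Toy algorithm from the text: output the input bit followed by its complement.
    return (bit, 1 - bit)

def shannon_entropy(samples):
    # Shannon entropy in bits, treating each sample as equally likely.
    counts = Counter(samples)
    total = len(samples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fair coin flip: both inputs equally likely.
outputs = [expand(b) for b in (0, 1)]   # [(0, 1), (1, 0)]

first_bits = [o[0] for o in outputs]    # takes values 0 and 1
second_bits = [o[1] for o in outputs]   # also takes values 1 and 0

print(shannon_entropy(first_bits))   # 1.0 bit in the first output bit
print(shannon_entropy(second_bits))  # 1.0 bit in the second output bit
print(shannon_entropy(outputs))      # 1.0 bit in the whole output string
```

Each individual output position has a full bit of entropy, yet the whole string still has only one bit, because the second bit is completely determined by the first.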
This holds even if the algorithm repeats the pattern: feeding it a 1 then produces 10101010101010, and each individual bit in that pattern has 1 bit of entropy, but the entire string also has only 1 bit of entropy, because it is either 10101010101010 or 01010101010101, so 2^1 possible states. So when you feed SHA-512 256 bits of entropy, you can take the first 256 output bits and they contain 256 bits of entropy, or you can take the second 256 bits and they contain 256 bits of entropy, even though the 512 output bits taken together still contain only 256 bits of entropy in total. Any individual 256-bit selection from the 512 bits contains 256 bits of entropy; the even distribution of entropy does not mean that each of the 512 bits contains half a bit of entropy. Just thought I would clear that up.
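The SHA-512 case looks like this in Python, using `hashlib` and `os.urandom`. The claim that either half carries essentially the full 256 bits is the usual heuristic assumption about cryptographic hashes spreading input entropy, not something the code can demonstrate:

```python
import hashlib
import os

# 256 bits (32 bytes) of entropy from the OS CSPRNG.
seed = os.urandom(32)

digest = hashlib.sha512(seed).digest()  # 64 bytes = 512 bits

# Under the heuristic that SHA-512 spreads the seed's entropy evenly over
# its output, either 256-bit half is a usable 256-bit key. But given the
# seed, each half determines the other, so the two halves together still
# hold only 256 bits of entropy, not 512.
key_a = digest[:32]   # first 256 bits
key_b = digest[32:]   # second 256 bits

print(len(key_a) * 8, len(key_b) * 8)  # 256 256
```

This mirrors the point above: you may select either 256-bit slice as a key, but you never accumulate more entropy than the seed supplied.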