Daniel Dennett blue-balled me. Now, to be fair, the book is about free will, not morality, but just the same, he hinted that he'd be giving an evolutionary view of morality and then stood me up.
This crap about morality being what's good for the community is all well and good, but it only works if you have some non-arbitrary way of defining what the community is. If I make the community really small, I can be completely selfish. If I make it really big, I can be a crazy altruist. I can pretty much slice and dice it, stretch it like an amoeba, to justify any moral feeling I might have. I suspect Wittgenstein's linguistic theories run into the same problem.
There will always be a choice---do I flip the switch and divert the runaway trolley from the track with the baby on it to the track with the twelve healthy teens? I look for the answer. Is morality what's good for the community? Fine, just let me know which community matters. (Also, you'll have to give me some definition of the good that I'm trying to maximize.)
What other options are there? Maximize my own reproductive fitness, perhaps? That's problematic for two reasons. First, our genes' interests are not perfectly aligned with one another, so even if I'm supposed to spread my genes---which genes? More importantly, second, just because I'm designed to duplicate genes, it doesn't follow that I have to, or even ought to, do that.
I see no alternative but simply "feeling" the morality out of the situation. Conjure up some intuition pumps, identify some moral facts, intuit a principle or two, and have at it. Believe, as I do, in some Platonic morality that obtains objectively, and in our ability to sense it---somehow.