The trick with evolution is that while it is certainly not absolute, it is both stochastic and contextual. Behaviors that yield a certain probability of genetic propagation in one situation may yield a different (perhaps lower, perhaps higher) probability in another situation.
Zarathustra wrote:
The thing you don't seem to understand is that "science", in the broadest sense of the word (i.e. empirically-based rationality), is only descriptive, not prescriptive. We can use reason to deduce how things are, but not to deduce how things should be, because should-ness is fundamentally arational (but not irrational).
Let's say I harbor desires that are evolutionarily counter-productive. How does this make them irrational? Evolution is a matter of fact. Those that do not act in such a way as to further their genetic lineage won't further their genetic lineage. This doesn't mean that they're any "better" or "worse" than those who do.
Also, I really don't see how you can categorize all value judgments as arational, yet refuse to define the quantity (should-ness) that is supposedly being measured.
Let me try to be clear. The initial propositions (base desires) are taken as givens. If they are consistent, any derivations from them will be consistent and the system will be rational. If they are inconsistent, derivations from them may be inconsistent and the system will be irrational.
In our case, our desires often conflict to varying degrees. We cannot simultaneously fulfill all of our desires, so we trade off degrees of fulfillment of one desire against another[1]. (Yes, I know this has been said above).
Now depending upon how much weight (importance) we attach to each desire, some tradeoffs may achieve a greater total fulfillment than others. The system that achieves the maximum possible fulfillment will be the most consistent one[2] possible <i>given the starting assumptions.</i>
So given the same initial set of desires, relative importances and situational context, some moral systems ARE better[3] than others.
[1.] If you're familiar with the math, picture a constrained optimization problem.
Otherwise don't worry about it. You get the idea anyway.
[2.] Possibly not unique.
[3.] When viewed from within the framework of that system.
EDIT: A much less wordy way to put this would be to say: Under identical situations, different (moral, evolutionary) strategies aimed at the same goals can be compared in terms of how well they meet those goals. In this context, we CAN say that one system is better than another.
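To make footnote [1] concrete, here's a minimal sketch of the tradeoff as a constrained optimization. All of the specifics are assumptions for illustration: two hypothetical desires ("leisure" and "career"), arbitrary weights, a fixed effort budget, and diminishing returns modeled as a square root. The point is only that, given the same starting desires and weights, different allocations yield measurably different total fulfillment, so some are better than others within that framework.

```python
# Illustrative sketch: two desires competing for a fixed budget of effort.
# The desire names, weights, budget, and sqrt (diminishing-returns) model
# are all assumptions chosen for the example, not anything canonical.

import math

WEIGHTS = {"leisure": 1.0, "career": 2.0}  # assumed relative importances
BUDGET = 10.0                              # total effort available (the constraint)

def total_fulfillment(effort_leisure: float) -> float:
    """Weighted total fulfillment for a given split of the budget."""
    effort_career = BUDGET - effort_leisure  # constraint: efforts sum to BUDGET
    return (WEIGHTS["leisure"] * math.sqrt(effort_leisure)
            + WEIGHTS["career"] * math.sqrt(effort_career))

# Grid search over feasible splits: the "best" strategy is whichever split
# maximizes total weighted fulfillment given the starting assumptions.
splits = [i * 0.01 for i in range(int(BUDGET * 100) + 1)]
best = max(splits, key=total_fulfillment)
print(f"best split: leisure={best:.2f}, career={BUDGET - best:.2f}")
```

With these particular weights the optimum puts most effort into the more heavily weighted desire, but change the weights (the "givens") and the ranking of strategies changes with them, which is exactly the "better only from within the framework" point of footnote [3].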