On this page I'm going to discuss a few concepts that are relevant to the analysis of moral merit and guilt. First, the axiom of non-hypothetical choice: you can't make a choice in a situation you aren't in, and thus you can never be blamed or credited for what you "would" have done.
I think most people know that intellectually, but still fail to apply it. For example, trying to partially or fully absolve oneself of a bad action with "you would have done it too" is not a valid defense. It is valid to point to the degree of temptation as a way of partially absolving someone, and if that's what you're getting at then fine, but phrasing it this way needlessly obscures the truth and opens the door to a nasty argument by making it a personal counterattack. Likewise, saying someone is a bad person because they "would have done X" isn't valid, but it is valid to argue that since they are the kind of person who seems like they "would do X", they have probably done X before.
Next, a corollary of that one, the axiom of inseparable motivation: where there are multiple incentives for an action, it is not possible to do it "for" only some of them. If they were all there, they all affected how hard or easy the decision was, so you were, by definition, motivated by all of them. In other words, it can't be said that a person is more loyal to X than to Y until they are forced to choose between the two.
Next, and more momentously, the axiom of necessary motivation: every action a person undertakes is either the most tempting option, the most moral option, or a compromise between the two. No one ever does what is both less tempting to them and less moral than another option they had and were aware of.
That might seem like I'm implying something I'm not. You might think there are loads of cases where someone acts in a way that seems both less enjoyable and equally or less moral than another available action. The main thing you're likely missing if you're making that objection is the concept of ideals. All of us have internal codes of actions we consider "correct", and we admire people who follow these codes - including ourselves. This creates an emotional incentive to act according to one's own ideology that is separate from the necessarily correct moral signals heard from one's conscience, because we want to feel proud of ourselves. Note that the formation of such an internal code can be explained as a self-interested way of increasing one's own happiness by enshrining how one already acts as "correct".
It's also worth talking about some psychological factors besides ideals that complicate temptation calculations:
Self-pity / rewarding oneself: when we have it bad, or after we do something morally good, we feel that we deserve some degree of indulgence, and so we find it harder to act well even when the degree of ostensible temptation is the same. Likewise, guilt can act as a spur that makes it easier to do the right thing.
Instinct: the more you do something, the more natural and automatic it becomes. We are always uncomfortable doing what feels unnatural to us. This goes a long way toward explaining why two people with different habits might find the same action in the same situation more or less alluring. It obviously has a lot of overlap with ideals, but it's not the same.
Discontinuity: if the only options in a situation are to do something extremely heroic or something extremely evil, the judgement of merit for either action is skewed. The evil action is less evil because the only way to avoid it was a great personal sacrifice, and the good action is less good because the only alternative would've led to self-hate.
I should also discuss the concept of intention here. The notion is common that you can "intend" to do something you haven't actually had the opportunity to do yet, and that this has moral weight. My axiom of non-hypothetical choice might seem to preclude this. But on the contrary, something like it can emerge: "intending" to do something consists of telling yourself you're going to do it, which raises the temptation to do it (by biasing your instinct toward it) and thus raises the chance that you will - and that carries a fraction of the moral weight. The reason people form such intentions is that it makes them more comfortable in the short term. For example, if I know that in a few hours I'll have the chance to undergo extreme pain for a noble cause, I'm likely to feel very anxious, because I think I might do it and thus I'm partially experiencing the fear of the pain in advance. Telling myself that I won't, and thus reducing the chance that I will, relieves my fear.