I'm asked to plot current vs. voltage for two circuits: one with just a constant voltage source and a forward-biased diode in series, and another with a constant voltage source, a forward-biased diode, and a resistor in series. I've attached all the information I have for this question as a JPG. My difficulty is in understanding how adding a resistor in series with the diode affects the circuit's current. I assume it must somehow; I just don't know how to model it mathematically. My teacher apparently considers this knowledge self-evident, but I still need someone to point out the obvious to me. Thanks in advance for any help.
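
In case it helps to show where I'm stuck, here's my rough attempt at computing the two I-V curves numerically. I'm assuming the standard Shockley diode equation; the parameter values (`I_S`, `N`, `V_T`, `R`) are placeholders I made up, not values from the attachment. For the diode-plus-resistor case I tried enforcing KVL, v_src = v_d + I·R, and solving for v_d by bisection, since I couldn't find a closed-form solution:

```python
import math

# Shockley diode model parameters -- assumed values, NOT from the attached problem
I_S = 1e-12    # saturation current (A)
N   = 1.0      # ideality factor
V_T = 0.02585  # thermal voltage at room temperature (V)
R   = 1000.0   # assumed series resistance (ohms)

def diode_current(v_d):
    """Shockley equation: current through the diode for a given diode voltage."""
    return I_S * (math.exp(v_d / (N * V_T)) - 1.0)

def current_diode_only(v_src):
    """Diode alone across the source: the full source voltage sits on the diode."""
    return diode_current(v_src)

def current_with_resistor(v_src):
    """Diode + resistor in series.

    KVL gives v_src = v_d + I(v_d)*R. The left side of the residual
    f(v_d) = v_d + I(v_d)*R - v_src is strictly increasing in v_d,
    so a simple bisection on [0, v_src] finds the operating point.
    """
    lo, hi = 0.0, v_src
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if mid + diode_current(mid) * R < v_src:
            lo = mid
        else:
            hi = mid
    return diode_current(0.5 * (lo + hi))

if __name__ == "__main__":
    # Sweep the source voltage to build the two I-V curves
    for v in [0.2, 0.4, 0.6, 0.8, 1.0]:
        print(v, current_diode_only(v), current_with_resistor(v))
```

If I've set this up right, the bare diode's current blows up exponentially once the source voltage passes the knee, while the resistor version levels off toward roughly (v_src − v_knee)/R. Is that load-line idea the right way to think about it?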