A random variable is just a mathematical object designed to capture a notion of uncertainty about the value that the variable takes. A simple example of a random variable is the outcome of a (not necessarily fair) coin toss. We know the outcome must be one of $H$ or $T$, but we don't know which until we've actually tossed the coin and the uncertainty has resolved.

Stochastic is, in the words of my stochastic analysis professor, a posh term for 'random'. Hence, stochastic processes and random processes are essentially the same thing.

A stochastic process is just a special type of random variable. You can think of it as a collection of random variables with some index -- typically time. Continuing with the coin-tossing example, a (finite or infinite) sequence of coin tosses is a stochastic process. Typically, the index set, which we'll think of as time, has some special structure. For instance, the index might be related to the revelation of information: in the coin-tossing example, you might naturally assume that you observe the coin tosses one at a time, so that when you know the outcome of the $n^{\text{th}}$ coin toss, you also know the outcomes of all the preceding ones.
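As a concrete sketch (in Python; the bias parameter and function name are my own illustrative choices), a finite sequence of biased coin tosses might look like this:

```python
import random

def coin_toss_process(n, p=0.5, seed=0):
    """Simulate n tosses of a coin that lands H with probability p.

    The resulting sequence (X_1, ..., X_n) is a stochastic process
    indexed by time: each toss is a random variable, and observing
    the n-th toss reveals it alongside all the earlier ones.
    """
    rng = random.Random(seed)
    return ["H" if rng.random() < p else "T" for _ in range(n)]

# One realisation of the process: a list of ten H's and T's.
tosses = coin_toss_process(10, p=0.7)
print(tosses)
```

Fixing the seed pins down one realisation of the process; rerunning with a different seed gives a different path through the same uncertainty.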

Markov chains/processes (I, like Wikipedia, prefer to think of them as the same thing, but others might want to make some technical distinctions between them) are a special type of stochastic process. In particular, their distinguishing feature is the Markov property of memorylessness.

However, not all stochastic processes are memoryless. A simple (albeit admittedly not very natural) example might be a coin whose probability of coming up $H$ depends on the outcomes of the last $5$ coin tosses (whenever you've tossed it at least $5$ times). On the other hand, a canonical example of a Markov process is a random walk.
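Both examples can be sketched in a few lines. The particular history-dependent rule for the coin below is my own illustrative choice, not anything canonical:

```python
import random

def history_dependent_coin(n, seed=0):
    """A coin whose chance of H depends on the last 5 outcomes.

    Illustrative rule (an assumption for this sketch): the more heads
    among the last 5 tosses, the likelier the next head. Viewed as a
    sequence of single outcomes, this is not Markov -- the next toss
    depends on more history than just the most recent outcome.
    """
    rng = random.Random(seed)
    tosses = []
    for _ in range(n):
        last5 = tosses[-5:]
        if len(last5) < 5:
            p = 0.5                               # fair until 5 tosses exist
        else:
            p = 0.25 + 0.5 * last5.count("H") / 5
        tosses.append("H" if rng.random() < p else "T")
    return tosses

def random_walk(n, seed=0):
    """Simple symmetric random walk: Markov, because the distribution
    of the next position depends only on the current position."""
    rng = random.Random(seed)
    pos, path = 0, [0]
    for _ in range(n):
        pos += rng.choice([-1, 1])   # step up or down with equal probability
        path.append(pos)
    return path
```

Note that the coin example stops being non-Markov if you enlarge the state to the whole window of the last $5$ outcomes, which is a common trick for turning history-dependent processes into Markov ones.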

One way to think about the Markov property is to ask yourself whether observations of past outcomes beyond the most recent one give you more information about the likely future outcomes of the stochastic process. If they contain no additional information, the process is Markov. My strange example is clearly not Markov, since you would have different beliefs about the distribution of the next toss depending on whether you knew only the most recent outcome or the outcomes before that too. The random walk is Markov since where the path currently is tells you everything about where it is going to go next -- additional observations of its previous history give you no additional information.

A more natural example of a stochastic process that isn't Markov appears in the Wikipedia article on Markov Chains. This is essentially the same as drawing differently coloured balls from an urn without replacement, which is a common source of examples in probability theory.
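To make the failure of the Markov property concrete, here is a small enumeration (the urn contents, two red and two black balls, are my own choice for illustration). It shows that, when drawing without replacement, knowing more of the history changes your beliefs about the next draw:

```python
from fractions import Fraction
from itertools import permutations

# Urn with 2 red (R) and 2 black (B) balls, drawn without replacement.
# All orderings of the four balls are equally likely.
orderings = list(permutations("RRBB"))

def prob(event, given=lambda seq: True):
    """Conditional probability of `event` given `given`, by enumeration."""
    matching = [s for s in orderings if given(s)]
    return Fraction(sum(event(s) for s in matching), len(matching))

# P(3rd draw is R | 2nd draw is R)
p_short = prob(lambda s: s[2] == "R", given=lambda s: s[1] == "R")
# P(3rd draw is R | 1st and 2nd draws are R)
p_long = prob(lambda s: s[2] == "R",
              given=lambda s: s[0] == "R" and s[1] == "R")

print(p_short, p_long)      # 1/3 vs 0
assert p_short != p_long    # extra history changes the probability: not Markov
```

The inequality is the whole point: conditioning on the first draw as well as the second shifts the distribution of the third, so the colour sequence on its own cannot be Markov.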