First MATLAB code, working with matrices for linear algebra

This is my VERY first stab at MATLAB - a question on my linear algebra homework requires its use. The problem is as follows:

A graph with five nodes and m directed edges can be described by an m x 2 matrix M by the following rule: label the edges 1, 2, ..., m. Row i of the matrix M is given by the vector [starting node number of edge i, ending node number of edge i].
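To make sure I understand the representation, here is a minimal sketch of how M itself could be built in MATLAB for the complete graph on five nodes, assuming the lexicographic row ordering produced by nchoosek is the ordering the problem wants:

% Edge list M for the complete graph on nodes 1..5: every pair appears once,
% and each edge is directed from the smaller node number to the larger one.
M = nchoosek(1:5, 2)   % 10 x 2, one row per edge, rows in lexicographic order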

(a) Write MATLAB code (or use another package) that creates the m x 5 incidence matrix A from M. Print A for the complete graph with five nodes (labeled 1, 2, 3, 4 and 5) and all m = 10 edges, such that: the edges are always directed from a smaller node number to a larger node number, and both columns of M are increasing.

(b) Print a basis for C(A) and C(A^T), the column space and the row space of A.

(c) Print L = A^T A and find the nullspace of L.

(d) Compute the projection p of b = (1, 2, ..., 10) onto the column space C(A). Since the columns of A are dependent, you will have to remove column 5 of A and calculate again the 4 x 4 matrix L1 as in part (c), so that L1 is invertible, to solve L1 x = A^T b and compute p = Ax.
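For part (a), here is a rough sketch of what I have in mind, assuming the usual incidence-matrix convention of -1 in the column of the starting node and +1 in the column of the ending node (the signs may need to be swapped if the course uses the opposite convention):

m = size(M, 1);          % number of edges = number of rows of M
n = 5;                   % number of nodes
A = zeros(m, n);         % start from an all-zero m x 5 matrix
for i = 1:m
    A(i, M(i, 1)) = -1;  % starting node of edge i
    A(i, M(i, 2)) = +1;  % ending node of edge i
end

For removing the column in part (d), my understanding is that MATLAB can delete a column in place, so the last part could look roughly like this (A1, L1, x, p are just the names I am using here):

b  = (1:10)';            % b = (1, 2, ..., 10) as a column vector
A1 = A;                  % keep a copy of the full incidence matrix
A1(:, 5) = [];           % delete column 5, leaving an m x 4 matrix
L1 = A1' * A1;           % the 4 x 4 matrix from part (c), now invertible
x  = L1 \ (A1' * b);     % solve L1 x = A1^T b
p  = A1 * x;             % projection of b onto the column space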

Here is my code so far. I think I almost have it, except for part (a) and removing the column in part (d). I'm kind of lost on part (a); I'm confused about how one does this sort of thing in MATLAB. I've programmed before, but only in Java and C.