How do I write an algorithm that tests if a relation on a matrix is a function

Hello all!

I have a problem that I don't know where to start with. I have to write an algorithm that finds out if a relation is a function. Okay, the question goes like this:

Write an algorithm that receives as input the matrix A of relation R on set X and tests whether R is a function. The matrix A will be n x n , where n = |X|. Find the best-case and worst-case number of comparisons used by your algorithm in terms of n.

Does anyone know how to write this algorithm? I hope so because I'm stuck.

Cheers,
Henrik

May 10th 2007, 07:05 AM

CaptainBlack

Quote:

Originally Posted by cpklas

Hello all!

I have a problem that I don't know where to start with. I have to write an algorithm that finds out if a relation is a function. Okay, the question goes like this:

Write an algorithm that receives as input the matrix A of relation R on set X and tests whether R is a function. The matrix A will be n x n , where n = |X|. Find the best-case and worst-case number of comparisons used by your algorithm in terms of n.

Does anyone know how to write this algorithm? I hope so because I'm stuck.

Cheers,
Henrik

I think you need something like the following to start this (I am assuming R is a binary relation):

Suppose that we number the elements of X, so X={x1, x2, ..., xn}

Let's suppose that the matrix A has a 1 in position (i,j) if (xi,xj) is in R, and
0 otherwise.

R is a function iff for all x, whenever (x,y) is in R and (x,z) is in R we have
z=y; then we may write r(x)=y.

Hence R is a function iff there is at most 1 "1" in each row of A.

So your algorithm has to check that each row of A contains no more than 1 "1"
and all other elements are 0.
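In Python the check might be sketched like this (under the "at most one 1 per row" reading above; the name is_function and the example matrix are my own):

```python
def is_function(A):
    """Return True iff each row of the relation matrix A
    contains at most one 1, i.e. R relates each x to at
    most one element."""
    for row in A:
        ones = 0
        for entry in row:
            if entry == 1:
                ones += 1
        if ones > 1:
            return False  # some x is related to two elements
    return True

# Example: relation {(x1,x1), (x2,x3)} on a 3-element set
A = [[1, 0, 0],
     [0, 0, 1],
     [0, 0, 0]]
print(is_function(A))  # True
```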

RonL

May 10th 2007, 07:15 AM

Plato

Quote:

Originally Posted by CaptainBlack

So your algorithm has to check that each row of A contains no more than 1 "1" and all other elements are 0.

That may well be the case, but I think it depends on the particular textbook. All of the set theory texts I have ever used require that the domain of the relation be the entire set in order for the relation to be a function. If that is the case here, then each row would have exactly one 1.

So check the definition of function in your textbook.
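Under that stricter definition the row test becomes "exactly one 1" rather than "at most one 1"; a sketch (the name is_total_function is my own):

```python
def is_total_function(A):
    """Return True iff every row of A contains exactly one 1,
    i.e. R assigns exactly one value to each element of X."""
    return all(sum(row) == 1 for row in A)

# A row of all zeros now fails: x2 has no image under R
A = [[1, 0, 0],
     [0, 0, 0],
     [0, 1, 0]]
print(is_total_function(A))  # False
```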

May 10th 2007, 09:10 AM

CaptainBlack

Quote:

Originally Posted by Plato

That may well be the case, but I think it depends on the particular textbook. All of the set theory texts I have ever used require that the domain of the relation be the entire set in order for the relation to be a function. If that is the case here, then each row would have exactly one 1.

So check the definition of function in your textbook.

Agreed, look up your definitions.

RonL

May 10th 2007, 04:12 PM

cpklas

Hi guys, thanks for your help. It made things a lot more clear to me.

The definition in my book is that every row must contain exactly one 1 for the relation to be a function.

CaptainBlack

What about the values in the elements of A which are not 1's? Or are
you assuming that A is a matrix of 0's and 1's? I would have thought you
would at least have required an assumption statement of some kind.

Something like:

Assumption: A is the incidence matrix of a relation, and so consists only of 0's and 1's

Your algorithm makes exactly one comparison for every element of A, plus one comparison
per row to check whether ones == 1, so the best-case and worst-case numbers of
comparisons are both n^2 + n.
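To see the n^2 + n count concretely, one could instrument the row check like this (a sketch; the names are my own), tallying one comparison per matrix entry plus one ones == 1 test per row:

```python
def count_comparisons(A):
    """Count comparisons made by the exactly-one-1 row check:
    one per element of A plus one `ones == 1` test per row."""
    comparisons = 0
    for row in A:
        ones = 0
        for entry in row:
            comparisons += 1   # compare entry against 1
            if entry == 1:
                ones += 1
        comparisons += 1       # check ones == 1 for this row
    return comparisons

n = 4
A = [[0] * n for _ in range(n)]
print(count_comparisons(A))  # n*n + n = 20
```

Note the count is the same for every input of size n, which is why the best and worst cases coincide.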

RonL

May 10th 2007, 09:17 PM

cpklas

Quote:

Originally Posted by CaptainBlack

What about the values in the elements of A which are not 1's? Or are
you assuming that A is a matrix of 0's and 1's?

RonL

Well, does that matter? If the definition is that every row must contain exactly one 1 for the relation to be a function, then it shouldn't matter what the other elements are, should it?

I had a look at the second part of the question (if my algorithm works): Find the best-case and worst-case number of comparisons used by your algorithm in terms of n.