I was recently learning about nodes and how they work in lists, and to test myself I decided to write my own LinkedList implementation in Java. Then I decided that, since there is already a LinkedList out there, why not make it sorted? So now it is a SortedList.

First, check whether the element being added is null. If it is, throw a NullPointerException, as I want the list to be "null-free" (mainly because null can't really be compared: comparing against it throws a NullPointerException).

Increment size.

Check if the list is empty (i.e. if the first Node is null).

If it is, create a new Node with the value of the added element and set it as the first. Otherwise, loop through the elements in the list until one larger than the element being added is found, then insert the element there.
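The steps above can be sketched roughly as follows. This is a minimal illustration, not the original code: the `Node` class, its field names, and the `toString` helper are all assumptions.

```java
import java.util.Objects;

public class SortedListSketch<E extends Comparable<E>> {
    // Assumed node shape: a value plus a link to the next node.
    private static class Node<E> {
        E value;
        Node<E> next;
        Node(E value, Node<E> next) { this.value = value; this.next = next; }
    }

    private Node<E> first;
    private int size;

    public boolean add(E element) {
        // Step 1: reject null, keeping the list "null-free".
        Objects.requireNonNull(element, "this list is null-free");
        // Step 2: increment size.
        size++;
        // Step 3: empty list, or new smallest element -> becomes the new first.
        if (first == null || element.compareTo(first.value) < 0) {
            first = new Node<>(element, first);
            return true;
        }
        // Step 4: advance until the next node is larger, then insert there.
        Node<E> current = first;
        while (current.next != null && current.next.value.compareTo(element) <= 0) {
            current = current.next;
        }
        current.next = new Node<>(element, current.next);
        return true;
    }

    public int size() { return size; }

    @Override
    public String toString() {
        StringBuilder sb = new StringBuilder("[");
        for (Node<E> n = first; n != null; n = n.next) {
            sb.append(n.value);
            if (n.next != null) sb.append(", ");
        }
        return sb.append("]").toString();
    }
}
```

Adding 3, 1, 2 in that order yields `[1, 2, 3]`: each insertion walks the list until it finds its sorted position.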

2 Answers

The way you keep the list sorted while adding elements is, I think, as fast as it gets with a linked list.
If you were using a data structure that allows fast random access,
then you could find the right insertion point faster by binary search,
but then you'd pay the penalty of array copying when shifting the rest of the elements,
so the overall benefit is questionable.

Keep in mind that if you have N elements and want them sorted,
then it's faster to add them in a regular list and then sort the list.
Sorting is typically bounded by \$O(N \log(N))\$,
while inserting \$N\$ elements one by one into an automatically sorted list is bounded by \$O(N^2)\$, since each insertion may traverse the whole list.
If you really need the list sorted at all times,
for example while adding items you also do other stuff that needs the list being built sorted,
then your list is faster than re-sorting every time.
I rarely see such a situation in practice.
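The add-then-sort alternative is simple with the standard collections; a generic sketch, not the author's code:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class AddThenSort {
    // Collect first, sort once: O(N) inserts plus one O(N log N) sort,
    // versus O(N^2) worst case for N sorted insertions into a linked list.
    public static List<Integer> sortedCopy(List<Integer> input) {
        List<Integer> result = new ArrayList<>(input);
        Collections.sort(result);
        return result;
    }
}
```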

What is the time-complexity of the methods? (Just want to find out)

The methods other than add don't benefit from the fact that the list is sorted.
For example indexOf (and contains, which uses it), remove, and others iterate all the way to the end, even after reaching an element greater than the target, at which point you could stop traversing the rest of the elements.
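An early-exit contains could look like this. It's a sketch with an assumed `Node` shape, not the original code:

```java
public class EarlyExit {
    static class Node {
        int value; Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    // Because the list is sorted, we can stop as soon as we see a node
    // greater than the target: the target cannot appear after it.
    static boolean contains(Node first, int target) {
        for (Node n = first; n != null; n = n.next) {
            if (n.value == target) return true;
            if (n.value > target) return false;  // past the insertion point
        }
        return false;
    }
}
```

The worst case is still a full traversal, but lookups for small or absent values return much earlier on average.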

The worst-case complexity of methods that work with a single element like add, contains, remove is \$O(N)\$: you have to iterate over all elements.
The optimization I suggested to use the sorted property everywhere won't change this.

The worst-case complexity of methods that work with a collection of elements like addAll, containsAll, removeAll is \$O(N M)\$ by the same reasoning, where \$N\$ is the size of the list, and \$M\$ is the number of the elements in the parameter.

Are there some bad practices in there?

for loops with an empty body are not great,
especially when the loop header packs in multiple statements, as yours does.
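A hypothetical illustration of the pattern (not the author's exact loop), with a clearer rewrite alongside it:

```java
public class LoopStyle {
    static class Node {
        int value; Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    // Condensed form: all the work happens in the loop header and the
    // body is empty, which is easy to misread.
    static Node findTerse(Node first, int x) {
        Node n;
        for (n = first; n.next != null && n.next.value < x; n = n.next)
            ;  // empty body
        return n;
    }

    // Clearer form: the traversal step lives in a visible body.
    static Node findClear(Node first, int x) {
        Node n = first;
        while (n.next != null && n.next.value < x) {
            n = n.next;
        }
        return n;
    }
}
```

Both return the node after which a value would be inserted; the while form makes the traversal explicit.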

It's a subtle thing,
but your sorting logic is not stable.
It would be better to make it so.
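Stable here means that elements comparing as equal keep their insertion order. A sketch of a stable insert, with an assumed `Item`/`Node` shape so that equal elements are distinguishable:

```java
public class StableInsert {
    static class Item {
        final int key; final String label;
        Item(int key, String label) { this.key = key; this.label = label; }
    }

    static class Node {
        Item item; Node next;
        Node(Item item, Node next) { this.item = item; this.next = next; }
    }

    // Stable: a new item is placed *after* existing items with an equal
    // key, so equal items keep the order in which they were added.
    static Node insertStable(Node first, Item x) {
        if (first == null || first.item.key > x.key) {
            return new Node(x, first);
        }
        Node n = first;
        while (n.next != null && n.next.item.key <= x.key) {  // <= skips equals
            n = n.next;
        }
        n.next = new Node(x, n.next);
        return first;
    }
}
```

The `<=` in the loop condition is what makes it stable; with `<`, a new equal item would land in front of the older ones.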

Bugs

You have a couple of bugs that need to be corrected:

toString crashes when the list is empty

remove(0) crashes

Node.equals crashes when next == null or value == null
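For the last bug, `Objects.equals` is the usual null-safe fix. A sketch with assumed field names, not the original class:

```java
import java.util.Objects;

public class NodeEquals {
    static class Node {
        Object value; Node next;
        Node(Object value, Node next) { this.value = value; this.next = next; }

        // Objects.equals handles nulls on either side, so this no longer
        // crashes when value == null or next == null.
        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof Node)) return false;
            Node other = (Node) o;
            return Objects.equals(value, other.value)
                && Objects.equals(next, other.next);
        }

        // equals and hashCode should always be overridden together.
        @Override
        public int hashCode() { return Objects.hash(value, next); }
    }
}
```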

Misc

In other implementations of List, toString returns values enclosed in [ ... ],
for example [1, 2, 3, 3, 3, 4].
I recommend doing likewise.
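StringJoiner makes this convention easy and, as a bonus, handles the empty list correctly (which fixes the toString crash mentioned above). A sketch with an assumed `Node` shape:

```java
import java.util.StringJoiner;

public class ToStringSketch {
    static class Node {
        int value; Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    // Matches the "[a, b, c]" convention of the standard List
    // implementations, and yields "[]" for an empty list.
    static String render(Node first) {
        StringJoiner joiner = new StringJoiner(", ", "[", "]");
        for (Node n = first; n != null; n = n.next) {
            joiner.add(String.valueOf(n.value));
        }
        return joiner.toString();
    }
}
```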

I stumbled upon this when I wanted to write a unit test with assertEquals(Arrays.asList(...), yourList),
realized that doesn't work because listIterator is not supported,
so I tried to work around that with assertEquals(Arrays.asList(...).toString(), yourList.toString()).
So I recommend implementing listIterator.
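As a starting point, a forward-only Iterator for a singly linked list could look like this (a sketch with an assumed `Node` shape; a full ListIterator additionally needs index tracking, and previous() is awkward without backward links):

```java
import java.util.Iterator;
import java.util.NoSuchElementException;

public class IteratorSketch {
    static class Node {
        int value; Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    // Minimal read-only iterator: walks the chain of next links once.
    static Iterator<Integer> iterator(Node first) {
        return new Iterator<Integer>() {
            private Node current = first;

            @Override
            public boolean hasNext() { return current != null; }

            @Override
            public Integer next() {
                if (current == null) throw new NoSuchElementException();
                int value = current.value;
                current = current.next;
                return value;
            }
        };
    }
}
```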