In my opinion, by far the most negative effect of WWI was WWII. The first war led pretty directly to the second.

The First World War left Germany, especially, with a desire for revenge and a need to upset the status quo. The Treaty of Versailles, which ended WWI, stripped Germany of much of its territory and imposed crushing financial reparations on it, fueling that desire for revenge. Germany also wanted to undo the provisions of the treaty that prevented it from having a serious military.

By punishing Germany so harshly, the Treaty of Versailles gave Germany a motive for starting WWII. This makes WWII the main negative effect of WWI.

The previous post was very strong. I would add that, from an intellectual standpoint, one of the most profound effects of World War I was the sense of fragmentation and disunity that gripped so much of the world in the war's aftermath. Nothing substantially positive or productive emerged from the conflict. Europe was crippled by the emotional, political, and psychological costs of the war, and each nation was forced to rebuild from virtually nothing. At the same time, very little was reaffirmed by the war. A conflict predicated upon nationalism, patriotism, and every other "-ism" that had once provided a sense of structure ended up completely undermining those beliefs through its destructive and brutal nature. In the end, this psychological malaise can be seen as a major negative effect of WWI.