Unfortunately, there is no 144Hz 3440x1440 ultrawide around; that is why this monitor is so expensive: the combination of 144Hz and IPS. Now, for everyone complaining that it is only 2560x1080: you will never notice that in games; it is only noticeable when doing other desktop tasks. And try pushing 3440x1440 at high/ultra settings while also hitting 144fps in most games. That is almost as hard as pushing 4K at 144fps. To drive new games at 3440x1440 at 144fps you would need GTX 1080/1080 Ti SLI. At 2560x1080, a single 1080 or 1080 Ti can easily push games at ultra settings at 144fps. That is why 2560x1080 is better for gaming than 3440x1440 for the time being.

But there is one at 100Hz, and basically nothing else has come close to the Predator X34 yet. There are plenty of disadvantages to this monitor using 1080p, and there would be absolutely none if it were any higher resolution, since you can still scale games down to whatever your graphics card is able to run.
No one cares if you can push games to extreme framerates at 1080p, because that was already possible years ago, entire GPU generations ago. It's what has pushed people to use things like DSR on their existing monitors, because there haven't been many good, affordable monitors at those higher resolutions. Other than that, the only thing that has held back monitor development is the updates needed to the HDMI and DisplayPort standards.
So at the end of the day, this monitor here solves no issues and breaks no boundaries on the market. A solid step in no direction.

I agree with the points you're making, but I have one question of clarification. By "scale games to whatever your graphics card is able to run," do you mean graphical quality settings (AA, filtering, detail levels, etc.) or resolution? If it's the latter, I would point out that not all games enable resolution scaling. And choosing, say, 1080p on a 1440p-native monitor will result in a blurry image.

Nvidia has DSR in the drivers, and it works with ANY game; the game doesn't see it, it only sees the 'virtual' resolution. So on a 1080p monitor you can basically emulate a 4K one (with 4x DSR): the Nvidia card renders the picture at 4K and then downscales it to 1080p, with the end result looking much better than a straight 1080p render.
https://www.youtube.com/watch?v=rSUSYaa6C9s
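To give a rough feel for what that downscale step is doing, here is a minimal, hypothetical sketch using Pillow. This is not Nvidia's actual DSR filter (the driver applies its own configurable Gaussian smoothness filter), and the filename is a placeholder:

```python
from PIL import Image

# Hypothetical example: a frame rendered at the 4x DSR "virtual" resolution
# of a 1080p display, i.e. 3840x2160 (2x the width and 2x the height).
# 'render_4k.png' is a placeholder filename, not a real asset.
frame = Image.open("render_4k.png")  # 3840x2160 source render

# Downscale to the monitor's native 1920x1080. Each output pixel blends a
# 2x2 block of rendered pixels, smoothing edges much like supersampled AA.
native = frame.resize((1920, 1080), resample=Image.LANCZOS)
native.save("displayed_1080p.png")
```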

What @Cyphre was saying is that 1080p is such an outdated resolution standard that Nvidia came up with the DSR trick to make it look a lot better, since there's so much spare GPU power for that these days.

Ah okay, so this is like an improved approach to what supersampling does. But this inverts the issue I'm bringing up: rendering the image at a resolution lower than the native res of the screen, not higher. (Of course, rendering 720p, as opposed to 1080p, should look better on 1440p due to the even pixel-scaling ratio.) So I think it would need to upscale 1080p to the 1440p screen. Or allow for scaling at <100% screen resolution.
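For what it's worth, the "even ratio" point comes down to whether the scale factor is an integer. A quick arithmetic sketch (Python; the resolutions are the ones from this thread):

```python
# Integer scale factors map each source pixel onto an exact block of screen
# pixels; fractional factors force interpolation, which is what blurs the image.
native_w, native_h = 2560, 1440  # a 1440p panel

for src_w, src_h in [(1280, 720), (1920, 1080)]:
    sx, sy = native_w / src_w, native_h / src_h
    clean = sx.is_integer() and sy.is_integer()
    verdict = "exact 2x2 pixel blocks" if clean else "interpolated (blurry)"
    print(f"{src_w}x{src_h} on {native_w}x{native_h}: scale {sx:.2f}x -> {verdict}")

# 1280x720 on 2560x1440: scale 2.00x -> exact 2x2 pixel blocks
# 1920x1080 on 2560x1440: scale 1.33x -> interpolated (blurry)
```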

I'm talking about lower than 1440p (see previous comments in response to Cyphre), not lower than 1080p. I can see why that would be silly for PC gaming : ) Nevertheless, I think "very easy" is a bit of a reach, unfortunately. I assume that by "pretty much any GPU these days" you're referring to 9- and 10-series cards? (We're talking about Nvidia's post-900-series drivers, so I'll leave AMD aside.) There are games that, at highest settings on a 980 or 1060, won't hit, let alone maintain, 60fps at 1080p (Rise of the Tomb Raider, GR: Wildlands). Of course, turn down some settings for a barely noticeable difference in image quality and that'll fix it. Still way better than the 30fps 900p upscaling that consoles do!
Edit: I should include an example case. I could see someone using a 3440x1440 monitor for work, but not having the GPU to game natively at that resolution—at least not at the desired frame rate and quality settings. Though with that much screen space (I'm assuming it's 34"), one could use the Nvidia Control Panel to prevent stretching the image and run it at 1920x1080 natively (so the rest of the unused pixels are just black). Thanks for the DSR info, by the way!

I would have to disagree with you. Yes, the X34 is a great monitor, but it is still only 100Hz rather than 144-165Hz. As for your point that you can just scale down your settings to work at 3440x1440 depending on your card, I can't agree with that. A game running at 1080p ultra looks far better than a game running at medium at 1440p or 4K. And this monitor does break boundaries on the market, because it is one of the first 144Hz ultrawide monitors. And remember, this is 2560x1080, not 1920x1080. Some games, like shooters, greatly benefit from high framerates; that's why some pro players in games like Overwatch play at the lowest possible settings to get the highest possible fps they can, because it makes your aim feel smoother. 3440x1440 is still too hard to drive with new games at ultra settings for 99% of PCs. 2560x1080 is still a better resolution for gamers, and I don't see what the disadvantages of it are for gaming, as you claimed. Now, if we were talking about content creation, then yes, this isn't a good option and a 3440x1440 monitor is a far better choice, but for gaming, a 2560x1080 ultrawide makes more sense because you get the higher refresh rate and a resolution that can be run at high/ultra settings on a lot more PCs.

Also note that even with a card like a 1080 Ti you still can't push all modern games at 1080p ultra settings at 144fps; try playing BF1, Ghost Recon Wildlands, Rise of the Tomb Raider, etc. 144fps at 1080p ultra in new games is still not possible across the board, even with top-of-the-line hardware. Saying that 1080p gaming has been conquered is not true. 1080p will continue to be the standard for gaming for a long time, because while we are moving quickly with GPU technology, the advances in graphics are moving quicker. To truly standardize higher resolutions like 1440p or 4K for gaming, GPU technology will have to move faster. Right now, if you said you were using a 1080 for 1080p gaming, people would say you are crazy and that it is overkill. It was the same with the 980 when it was new, and now guess what? The 1060 is basically equal to the 980, and the 1060 is "the best 1080p gaming card." When the GTX 1180/2080 comes out, the GTX 1160/2060 will have the same power as a 1080 for $250, and it will be the 1080p gaming card. It has been the same with past generations as well, because graphical advancements just keep moving along with GPU advancements. To ever standardize 1440p or 4K gaming, we have to reach a plateau of graphical technology and let GPU tech push far ahead.

I just want to mention that it is fairly useful to have a resolution above 1080p in quite a few games, at least if you're playing first-person survival/shooter games in particular. (At longer ranges it is a heck of a lot easier to make out the outlines of players, and a lot easier to see clearly what they're doing, as well as what you're actually looking at.)
Rust is a great example. You will often notice entities moving in the distance, but details such as clothing, weapons, and size (whether it is an actual player at all) will be noticeably harder to make out at 1080p than at 1440p/4K.

Bullshit. 1080p will never be better at this monitor size than 1440p. Jeez, MD loves putting shills on this drop just to move this junk of a monitor, because there is no way anyone with half a brain would buy this crap on its own merits unless he gets brainwashed 😂

For sure. Plus, just to add to your point, a lot of recently released AAA games do indeed run quite well (Doom, Overwatch). Heck, part of the fun for me in playing older games is getting to rock their graphical settings at 1440p144 : )

"Now for everyone complaining that it is only 2560x1080p, you will never notice that in games, it is only noticeable when doing other desktop tasks."
I call B.S. on that statement! There's a considerable difference between games running at 1440p and at 1080p. Why do you think it takes more GPU processing power to run a game at 1440p? It's pushing more pixels, and there's a higher pixel density.
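Just to put numbers on "pushing more pixels" (simple arithmetic, using the two resolutions being debated in this thread):

```python
# Pixels per frame for the two ultrawide resolutions discussed here.
flat = 2560 * 1080   # this monitor:      2,764,800 pixels
tall = 3440 * 1440   # e.g. Predator X34: 4,953,600 pixels

print(f"2560x1080: {flat:,} pixels per frame")
print(f"3440x1440: {tall:,} pixels per frame")
print(f"3440x1440 pushes {tall / flat:.2f}x as many pixels")  # ~1.79x
```

So at the same settings, 3440x1440 asks the GPU for almost 80% more work per frame than 2560x1080.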

I was referring to the fact that you will not notice the screen-door effect while gaming, because your mind doesn't focus on stuff like that in the heat of a game. And when did I ever deny that 3440x1440 is harder to run? Obviously it is, and the two points you made have no correlation whatsoever.

Keep telling yourself that, and I'm sure that you, and maybe others, will believe you. I'm thinking it's a defensive stance to rationalize your purchase of a 1080p monitor. Your first mistake is the notion that you have to run a game at the maximum refresh rate and FPS for it to be any good. The truth of the matter is that 1440p is a vast improvement over 1080p in games and desktop applications. You don't need the game to run at 144fps with a matching refresh rate for it to be "worth it." With a variable-refresh-rate monitor (whether you are using G-Sync or FreeSync), the refresh rate constantly tracks your FPS so that gameplay is smooth, which should eliminate tearing, stuttering, and artifacts. As long as it's a playable FPS, you are probably not going to notice what FPS the game is running at unless you have a frame counter or it dips below a playable frame rate.

I don't have a 2560x1080 monitor, but I've used one before and it did not bother me. I'm not denying that 3440x1440 looks better; obviously it does. I'm saying that while gaming you probably will not be massively bothered by the lower resolution. All I'm saying is that for the time being, I would take a 2560x1080 monitor at 144Hz over a 3440x1440 monitor at 100Hz. As someone who takes competitive shooters like CS:GO and Overwatch seriously, telling me that I can't notice the difference between 144Hz and 100Hz is just flat-out wrong. I'm also not saying that the FPS and refresh rate have to be the same; I'm saying I would take the higher FPS and refresh rate over the extra resolution for competitive gaming any day. Take a game like Overwatch: pro players play at the lowest possible settings to get the highest possible FPS, because the higher the FPS, the smoother the mouse movement feels, and the same applies to any shooter. The difference in mouse feel between 100fps and 200fps is massive, and that feeling will massively affect your aim. The reason 2560x1080 is superior for competitive play is not just the currently higher refresh rate; the lower resolution also allows for a higher frame rate, especially for those who don't have top-tier hardware. Now, I could absolutely game at 3440x1440 with my 1080 Ti, but currently, for anything competitive, I play at 1080p 144Hz to keep the higher FPS. For single-player and casual games, I switch over to my 4K 60Hz monitor. If you don't play competitive shooters and have the computer horsepower, you will probably enjoy 3440x1440 at 100Hz more than 2560x1080 at 144Hz, but for anyone like me who takes competitive gaming seriously, the extra resolution is in no way worth giving up refresh rate and FPS.
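To put the "smoother mouse feel" point in concrete terms, the frame rates above translate directly into frame times (quick arithmetic):

```python
# Frame time = how long each frame stays on screen. The gap between frames
# is a large part of what you feel as mouse smoothness and input delay.
for fps in (60, 100, 144, 200):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")

#  60 fps -> 16.67 ms per frame
# 100 fps -> 10.00 ms per frame
# 144 fps ->  6.94 ms per frame
# 200 fps ->  5.00 ms per frame
```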

I'm certainly not a competitive online gamer and I agree with what you said about having an advantage with a higher refresh rate and the lower resolution. I'm about the immersion with smooth gameplay and high resolution.

You're exactly right, because I do that now. Scaling down to anything that's not the recommended resolution looks like crap. It's playable, but ugly compared to a native 1080p monitor. The hardware just isn't there for 144Hz 1440p, not to mention the standards aren't either.