From what I understand from the documentation, it runs the (development) application natively on the PC but redirects the video output to the tablet screen.
It then streams the touch/accelerometer/gyro/camera data from the tablet back to the application on the PC,
allowing you to compile and run your application locally, but still get a feel for how it would look on the mobile device without having to deploy.
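To make the round trip concrete, here is a minimal sketch of the idea: the PC sends rendered "frames" down the wire, and the device sends input events back. The wire format, message names, and use of a local socket pair are all illustrative assumptions, not Unity's or Qt's actual protocol.

```python
import json
import socket
import struct

def send_msg(sock, payload: bytes):
    # Length-prefixed framing so messages survive TCP stream boundaries.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock) -> bytes:
    (length,) = struct.unpack(">I", sock.recv(4))
    data = b""
    while len(data) < length:
        data += sock.recv(length - len(data))
    return data

# Simulate the PC <-> device link in-process with a socket pair.
pc, device = socket.socketpair()

# PC side: stream a rendered "frame" (placeholder bytes here).
send_msg(pc, b"FRAME-0001")

# Device side: receive the frame, reply with a touch event.
frame = recv_msg(device)
send_msg(device, json.dumps({"type": "touch", "x": 120, "y": 340}).encode())

# PC side: feed the event back into the locally running app.
event = json.loads(recv_msg(pc).decode())
print(event["type"], event["x"], event["y"])
```

In a real implementation the frame would be a compressed screen capture and the event stream would carry multitouch, accelerometer, and gyro samples, but the loop is the same: render locally, display remotely, echo input back.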

Is this what you are looking for, gleach31?

I haven't seen it in the Qt development environment, and I looked for it here: https://bugreports.qt-project.org, but it hasn't been requested yet. No reason why you can't file a request, though.

Maybe there is something I don't know about Unity ... but what Unity does for debugging an app seems a bit weird.
Sorry if this offends anyone, but it seems pointless to do this back-and-forth of data to debug something that isn't really running on the target device.
It sounds to me like Unity works this way because it cannot do real on-device debugging, as Qt Creator can.

So, my suggestion is to try the debug mode of Qt Creator, and tell us whether there is any real, valuable difference from what Unity does with its remote control.

Dear clogwog, thank you, that's the sort of thing I'd like. And so do over half a million Unity Remote users.

Gianluca - deploying to a target device every time you make a tiny little change (particularly when trying out different styles, tweaking a font size, etc.) is a ridiculous waste of time. It is far easier to make changes on the main platform you like to edit on, and then see what it looks like on a tablet. It also lets you try out other aspects, like sensors, which don't exist on a PC or MacBook, again without deploying.

"So, my suggestion and try the debug mode of Qt Creator. And tell us if there is a real valuable difference on what Unity does with remote control."