Meta

Tag: OpenShot

In this earlier posting, I wrote that although I deemed it improbable that this sort of Linux application would run on my Linux tablet, I would nevertheless try, to see whether I could get such a thing to run. And as I wrote, I had considerable problems with ‘LiVES’: even if I had gotten the stuttering preview-playback under control, I could not have put LiVES into multi-tracking mode, which rendered the effort futile. I had also written that on my actual Linux laptop, LiVES runs ~perfectly~.

And so a natural question to ask next would be, ‘Could OpenShot be made to run in that configuration?’ The short answer is no.

‘OpenShot’, as well as ‘Kdenlive’, uses a media library named ‘mlt’ (but which is also referred to as ‘MeLT’) to perform its video compositing. I think that the main problem with my Linux tablet, when asked to run such applications, is that it has only a 32-bit quad-core, and an ARM CPU at that. ARM CPUs are designed so that they are optimal at running Dalvik bytecode (which, I have just learned, has been succeeded by ART) through the interpreter and compiler that Android provides, and in certain cases at running specialized codecs in native code. They do not have ‘MMX’ extensions and the like, because they are RISC chips.

When we try to run CPU-intensive applications that have been compiled to native code on an ARM CPU, we suffer an additional performance penalty.

The entire ‘mlt’ library is already famous for requiring a great deal of CPU power, in order to be versatile in applying effects to video timelines. There have been stuttering issues when trying to get it to run on ‘real Linux computers’, though not on mine. My Linux laptop has a 64-bit quad-core, AMD-family CPU with many extensions. That CPU can handle whatever I throw at it.

What I also observe when trying to run OpenShot on my Linux tablet is that if I right-click on an imported video clip and then left-click on Preview, the CPU usage is already high, and I get some mild stuttering / crackling of the audio. But if I then drag that clip onto a timeline and ask the application to preview the timeline, the CPU usage is double what it would otherwise be, and I get severe playback slowdown as well as audio stuttering.

In itself, this result ‘makes sense’: even if we have not elected to put many effects into the timeline, the processing that takes place when we preview it is as heavy as it would be if we had applied an arbitrary number of effects. I.e., the processing is inherently slow, to allow for the eventuality that we would apply many effects. So slow, that the application doesn’t run on a 32-bit, quad-core ARM, when compiled to native code.

In this earlier posting, I criticized a non-linear video editor named ““, stating that it was unstable. I think I need both to soften and to pinpoint my criticism of this software.

At the time, I was mainly having problems with the Windows version of , which its developer had worked long and hard to port to Windows. But the actual problem with  under Windows has to do with an environment variable named ‘‘. When we install certain Linux-centric software under Windows, we are often instructed to set our ‘‘ variable to point to it. But with the Qt libraries, there is an additional variable named ‘‘, which states which folders the Qt libraries are to be found in. This is a global variable under Windows, even though we may have several pieces of software installed that use different versions of the Qt libraries.

The way it is with me, I have ‘‘ installed, which is also a Qt-based application that has been ported to Windows. Normally, a Windows executable will look in its own folder first, for any .DLL files it needs, before looking in other directories.

But  is an application that comes with many Qt library files, located in its own directory or directories. And so, to point the executable to these libraries, the installer sets this environment variable.

The problem here is not strictly the fault of the  developers. With Qt applications, the slightest mismatch between the Qt version the application was linked against and the version of the shared libraries it links to again when it is run, will cause typical error messages. This problem has already happened to many users who install Qt-based applications under Linux that were not compiled and linked against the distribution’s packaged version of Qt. In fact, when we install certain Qt-based applications that are out-of-tree in this way, we often need to do something like this:

rm -f libQt*.so*

in the folder of the software in question, to force that software to link to the Qt libraries we have installed globally, instead of to its own bundled copies of these libraries, before the software will run.
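A quick way to see which Qt copies a binary will actually resolve at run time is to inspect its dynamic linkage. This is a generic sketch, not from the posting; the path in `APP` is a made-up placeholder:

```shell
#!/bin/sh
# Sketch: list the Qt libraries a binary resolves at run time.
# APP is a hypothetical path, not a real application from this posting.
APP="${APP:-/usr/local/someapp/bin/someapp}"
if [ -x "$APP" ]; then
  # Bundled libQt*.so* copies in the application's own folder show up
  # here instead of the globally installed ones.
  ldd "$APP" | grep -i 'libQt' || echo "no Qt libraries linked"
else
  echo "binary not found: $APP"
fi
```

If the listed paths point into the application's own directory rather than the system library directories, the mismatch described above becomes visible.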

So in my case, when the time came for  to ask me for my passphrases, in order to unlock my private keys, this library mismatch prevented the software from working. At first this might sound like some sort of malware problem, but it is really just a library-incompatibility problem.

And so the only way I was able to clean up this problem on my Windows 7 machine ‘‘, was to uninstall the Windows version of , which its authors had provided so carefully, and to delete this environment variable as well. Apparently, if the variable is set but the folder it points to is empty, or nonexistent, that is not enough to convince a Qt application to fall back to its own Qt libraries. In my case I needed to delete the variable as well, before  became stable again.

Now, if a user only expects to be running software on his Windows computer that uses one, global Qt version, this could all look very different. But in my usage scenario:

1. The way I set up my Windows machine is different, and

2. I have access to  on my Linux machines, for which reason I do not need this software under Windows.

I think that I had the additional problem, on the machine I name ‘‘, that  once crashed my X-server when instructed to play back a corrupted video file in its preview window, at full quality. Yet, I have already learned that ‘‘ has a weak, fragile instance of the X-server running… So this, too, may not strictly be the fault of the  developers.

On fully set-up Linux computers, we have two important non-linear video editing applications available from the package manager: ‘Kdenlive’ and ‘OpenShot’. As the naming would suggest, Kdenlive is centered on the KDE desktop, while OpenShot is not. Both use the ‘mlt’ video processing library as their back end.

(Edit: I goofed, first when I attempted to transcode the video using , and then again, when I wrote this posting about it.

What I had expected to find in the Project Settings of  was that the two “NTSC” formats would be grouped together somehow, similarly to the way they are with , starting with “NTSC”. Instead,  has a long drop-down list, which is alphabetized by the software itself, not by the programmers.

It has the setting simply named “NTSC”, which refers to the older, 4:3 version. But then, when the user wants to find the 16:9 version, he must scroll way up along the drop-down list, until he gets past the “HD” entries, all the way to the entry “DV/DVD Widescreen NTSC”. And then that setting will perform as expected.

therefore does not have this weakness; my mistaken choice of output format was in fact the true reason for the malformed video file that resulted. )

There is a specific feature in , which does not work properly. To understand what is expected of this feature, the reader must first of all understand something about the background of analog video.

Back in those days, the NTSC, PAL or SECAM signal formats assumed a 4:3 aspect ratio, which was later translated into a digital equivalent; the digital version simplified its resemblance to analog by assuming a pixel aspect of 1:1. This first resulted in the VGA format. In other words, in order for a rectangle of square pixels to have a ratio of 4:3, it needed to be a 640×480 image. Because of the way analog signals were processed in practice, however, an analog video signal could never really resolve anywhere near 640 points of horizontal resolution. 483 vertical scan lines were smeared horizontally by the filters, resulting in maximally 480 horizontal points in the case of NTSC, and that was assuming a comb filter was used, which was also not available in the early days of TV. In reality, when the signal format was adapted to DVDs, a strict NTSC format was defined that consisted of 486 vertical scan lines, but which had 720 points horizontally, a pixel aspect of 8:9, and a frame rate kept consistent with the 29.97 Hz of the analog signal. This latter detail may in fact be important in video editing, because if an incorrect frame rate is combined with correct modern sound compression, an eventual loss of synchronization may result after longer intervals of play.

At some point in time, picture aspects of 16:9 started to become popular with DVDs, and the format was adapted to this by making the pixel aspect 32:27 while maintaining 720 horizontal points. The question follows, how this change in aspect ratio was communicated to an analog TV, and the answer lies in the fact that analog TVs had several modes with which to display a 4:3 signal on their 16:9 physical screens: one was to letter-box, another to oversize, and another simply ‘to stretch to make it fit’. Viewers would simply observe that this last option sometimes produced distorted results, and sometimes not.
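The arithmetic behind these pixel aspects can be checked directly: the display aspect is the horizontal point count times the pixel aspect, divided by the line count. The sketch below assumes the 480 active lines commonly cited for the digital format (the analog figure of 486 scan lines given above differs slightly):

```shell
#!/bin/sh
# Display aspect = horizontal points x pixel aspect / active lines.
awk 'BEGIN {
  dar_43  = (720 *  8 /  9) / 480;  # 4:3  NTSC: 720 points at PAR 8:9
  dar_169 = (720 * 32 / 27) / 480;  # 16:9 NTSC: 720 points at PAR 32:27
  printf "4:3 preset:  %.4f (4/3  = %.4f)\n", dar_43,  4/3;
  printf "16:9 preset: %.4f (16/9 = %.4f)\n", dar_169, 16/9;
}'
# Prints 1.3333 for the 4:3 case and 1.7778 for the 16:9 case,
# matching 4/3 and 16/9 exactly.
```

So the same 720 horizontal points yield either display aspect purely through the pixel-aspect metadata, which is exactly the notion the next paragraphs turn on.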

I.e., Nothing was done to communicate the meta-data; the picture was simply stretched to 16:9 on-demand, by the viewer settings.

The sad fact about  is that somehow, it does not handle this notion correctly. As many video editing applications do,  has a set of presets to which the project can be formatted, as possible output formats, and one of them is strict NTSC. This setting assigns some number of pixels, but still assumes an aspect ratio of 4:3, even when the user wants 16:9.

OTOH, has a clear setting for NTSC, but in 16:9 instead of in 4:3. I have not tried them with PAL.

This failure of  to process its metadata correctly has already slowed down one of my projects recently. I had a number of arbitrarily formatted MP4 videos, and felt at first that the easiest way to transcode them into NTSC 16:9 would be just to open each in , and to set the output format to strict NTSC. The results were MPEG-2-compressed streams which other, external player applications could not play back.

Under the circumstances this was not a major setback, because I found an appropriate set of ‘‘ command-line recipes, which transcoded everything as I needed it. But a full-featured video editing application needs to be able to make this format one of its targets.
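For reference, a recipe of the general shape involved might look as follows. This is an illustration only: the posting's exact recipes are not preserved, ffmpeg is assumed here as the transcoder, and the file names are placeholders.

```shell
#!/bin/sh
# Hypothetical transcode to DVD-compliant NTSC 16:9. "-target ntsc-dvd"
# selects MPEG-2 at 720x480 / 29.97 fps; "-aspect 16:9" records the
# widescreen display aspect in the stream's metadata.
CMD="ffmpeg -i input.mp4 -target ntsc-dvd -aspect 16:9 output.mpg"
if command -v ffmpeg >/dev/null 2>&1 && [ -f input.mp4 ]; then
  $CMD
else
  echo "skipped (ffmpeg or input.mp4 not present): $CMD"
fi
```

The point of such a recipe is that the 16:9 display aspect is carried in the stream's metadata, which is precisely what the editor's strict-NTSC preset failed to do.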

In a previous blog posting, I reported that OpenShot was dangerously unstable, even under its native Linux. I am basing this on OpenShot 1.4.3, installed from the package manager under Debian / Jessie, with a KDE 4.14 desktop.

Firstly, I have changed the Output Mode, with which this application renders its previews, from “sdl” to “sdl_preview”.

More importantly, there seems to be a detail in its practical use which I was unaware of before. Earlier, I had imported a captured .OGG / .OGV file into its video clip resources. In itself this presented a problem, in that certain .OGV Theora files, especially ones produced by screen capture, are known to give this program trouble. This can be anticipated by the video clips in question having blank thumbnails.

On the first try, I told OpenShot to play the video clip anyway in its preview window. The progress bar went from the beginning to the end at the correct speed, but once it reached the end, the application became unresponsive and KDE had to shut it down forcibly. This was with the Output Mode still set to “sdl”.

Apparently, once OpenShot has crashed, it has saved corrupted information into the folder ‘~/.openshot’. Had I known this, I could have deleted that folder completely before trying out the application again. But instead I tried to use OpenShot again right away, sometimes telling it to play the .MP4 version of the same video, or other clips.

That was when my desktop froze. The X-server did not crash, but no movement or mouse input could be given anywhere on the desktop. The actual mouse pointer was still moving in response to the mouse, however, and it was also changing from the usual pointer to the special pointer when hovering over the preview panel. I needed to use <Ctrl>+<Alt>+F1 to open a virtual terminal in text mode, and then to ‘kill -9’ the application that was pinning the graphical virtual terminal.

My setups typically allow for multiple sessions to be logged in at once, and Virtual Terminals 1-6 are text-based, while the graphical ones start from Virtual Terminal 7. So once the offending processes were killed, I was able to <Ctrl>+<Alt>+F7 back into the graphical session, which was not corrupted.
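The recovery steps above can be expressed as commands one would run from the text virtual terminal; ‘openshot’ is the process name assumed for the offending application:

```shell
#!/bin/sh
# From a text VT (Ctrl+Alt+F1): find and force-kill the frozen process.
PIDS="$(pgrep openshot || true)"
if [ -n "$PIDS" ]; then
  kill -9 $PIDS   # last resort; any unsaved project state is lost
else
  echo "no openshot process running"
fi
# Ctrl+Alt+F7 then switches back to the graphical session.
```

Killing only the offending process, rather than restarting the X-server, is what left the graphical session itself uncorrupted.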

But the way I finally broke this spell with OpenShot was to delete the directory ‘~/.openshot’ and its corrupted contents. After that, the application was able to play the .MP4 video clips fine, and those clips also had thumbnails that corresponded to their content.

Also, if I decide that a user-space configuration is needed to ensure the stability of the system, I copy the contents of the configuration folder for an application into ‘/etc/skel’, from which a skeleton of starting files is copied every time the home directory of a new user is set up. That way, any newly created user will inherit my recommended settings.

In order to do that, after deleting the config folder once, I launch the application only briefly, make my configuration settings, and exit again immediately. At that point I feel that the config folder is in its correct state to be copied to ‘/etc/skel’.
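The whole ‘/etc/skel’ mechanism can be sketched end-to-end with scratch paths. This demo is an assumption-laden illustration: a real system would use $HOME/.openshot and /etc/skel as root, and the config file name below is made up:

```shell
#!/bin/sh
set -e
# Simulate: a vetted per-user config, a skeleton dir, and a new user.
DEMO="$(mktemp -d)"
mkdir -p "$DEMO/home/.openshot"
echo "output_mode=sdl_preview" > "$DEMO/home/.openshot/demo.conf"  # hypothetical file
# Step 1: copy the vetted config folder into the skeleton:
mkdir -p "$DEMO/skel"
cp -a "$DEMO/home/.openshot" "$DEMO/skel/"
# Step 2: what creating a new user's home effectively does - seed it
# from the skeleton, dotfiles included:
mkdir -p "$DEMO/newuser"
cp -a "$DEMO/skel/." "$DEMO/newuser/"
test -f "$DEMO/newuser/.openshot/demo.conf" && echo "settings inherited"
```

Because the skeleton is copied at account-creation time, changes to ‘/etc/skel’ affect only users created afterwards, which is why the config folder needs to be in its correct state before it is copied there.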

There are certain purposes which OpenShot may suit better than its alternatives, simply because OpenShot has more features.