Introducing gst-launch-dynamic

Fri 05 April 2013 by Peter Ward
I thought it would probably be a good idea to write posts about the random projects I do. As it happens, this particular post is about a tool which still could do with more work, but is in a reasonable enough state to present to the world. I’ve got another tool which I’ll write a post about after doing a little clean up.
So, on to the entertainment. GStreamer is a multimedia framework, which lets you build applications which do all kinds of strange and wonderful things with multimedia (audio / video / anything you can stream).
I like to think of it as UNIX pipelines for multimedia: a GStreamer pipeline is built out of many elements (processes), and each element has a number of pads (stdin / stdout / stderr / other fds) which can be joined to other elements. Indeed, there’s even a nice tool called gst-launch which lets you express the pipeline in a UNIX-like syntax. In fact, it’s better: it lets you connect any element to any other element, which is not possible (as far as I can tell) in the POSIX shell language.
What is it?
Great, so that’s what GStreamer is; what does my tool do? Well, it’s almost the same as gst-launch, except for one little thing: it lets you modify the pipeline at runtime. I think this is both amazing and slightly insane: it greatly lowers the barrier for ad-hoc experimentation with pipelines, and allows for easy scripting of pipelines.
How does it work?
It’s very simple. To run it, call it in the same way you would normally call gst-launch:
$ gst-launch-dynamic videotestsrc ! autovideosink
It will construct the pipeline, and start playing it, in this case showing the SMPTE colour bars. But, to let you change the pipeline, it also watches standard input for commands, so we can type this:
videotestsrc0.pattern = 1
This sets the pattern property on the videotestsrc0 element (since we didn’t give it a name, it gets automatically numbered) to 1, which corresponds to "snow" (i.e., black & white noise). As soon as you press enter, the property is set, and the video displayed on your screen will change.
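Scripting this is straightforward: anything that writes lines of this form to the tool’s standard input will do. Here’s a minimal sketch (the helper name and the subprocess wiring are my own invention, and it assumes gst-launch-dynamic is on your PATH):

```python
import subprocess

def format_set(element, prop, value):
    """Render one 'element.prop = value' command. Using repr() keeps the
    right-hand side a valid Python literal, which is what the tool parses."""
    return '%s.%s = %r\n' % (element, prop, value)

def demo():
    # Hypothetical usage: spawn the tool and flip the test pattern to snow.
    proc = subprocess.Popen(
        ['gst-launch-dynamic', 'videotestsrc', '!', 'autovideosink'],
        stdin=subprocess.PIPE, universal_newlines=True)
    proc.stdin.write(format_set('videotestsrc0', 'pattern', 1))
    proc.stdin.flush()

print(format_set('videotestsrc0', 'pattern', 1), end='')
```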
It’s probably worth noting a bug (well, more of an inconsistency) here: the thing on the right-hand side of the equals sign is interpreted as a Python literal (using ast.literal_eval), which isn’t exactly the same syntax that the command-line arguments (and gst-launch) use.
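The difference matters most for strings: ast.literal_eval only accepts Python literals, so a bare word that gst-launch would happily accept on the command line needs quotes on stdin. For example:

```python
import ast

# On the command line: videotestsrc pattern=snow
# On gst-launch-dynamic's stdin, the value must be a Python literal:
assert ast.literal_eval('1') == 1            # integers work either way
assert ast.literal_eval('"snow"') == 'snow'  # strings need quotes

# A bare word is not a valid Python literal, so this raises ValueError:
try:
    ast.literal_eval('snow')
except ValueError:
    pass
```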
Aside from changing properties, you can control the state of the pipeline with these commands:
play
pause
stop
You can rewire the pipeline by adding, removing, linking and unlinking elements. This is a little unstable at the moment, since it’s just translating commands into GStreamer API calls without any sanity checking. It is possible to replace an element with another at runtime, but you need to be careful about unlinking and relinking the pads.
Link an element to another: element.sink -> element.src (in this case, both pad names are optional)
Unlink an element from another: element.sink x> element.src
Add an element: + type key=value …
Remove an element: - element
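To make the rewiring commands concrete, here’s a hedged sketch of a helper that emits the full sequence for swapping one element out for another (the helper and the element names are my own, and pausing first is just my precaution, not something the tool demands):

```python
def replace_element(upstream, old, new_type, new_name, downstream):
    """Build the command sequence that swaps `old` out for a new element
    of type `new_type`, relinked between `upstream` and `downstream`."""
    return [
        'pause',                                 # precaution: rewire while paused
        '%s x> %s' % (upstream, old),            # unlink old from its neighbours
        '%s x> %s' % (old, downstream),
        '- %s' % old,                            # remove it from the pipeline
        '+ %s name=%s' % (new_type, new_name),   # add the replacement
        '%s -> %s' % (upstream, new_name),       # relink around it
        '%s -> %s' % (new_name, downstream),
        'play',
    ]

# Hypothetical example: swap a videobalance for a videoflip.
for cmd in replace_element('decodebin0', 'vb', 'videoflip', 'vf', 'autovideosink0'):
    print(cmd)
```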
One immediate possibility is to write a program which sends these commands. A side note: during development, FIFO files are great for keeping the pipeline open while you modify your script. For example:
$ mkfifo commands.fifo
$ gst-launch-dynamic … < commands.fifo
$ ./my-random-script >> commands.fifo
As an example of how this can be completely useless, but very entertaining, here’s how to make any video psychedelic.
import time

STEP = 0.01
DELAY = 0.01

def drange(start, stop, step):
    r = start
    while r < stop:
        yield r
        r += step

while True:
    for value in drange(-1.0, 1.0, STEP):
        print 'vb.hue =', value
        time.sleep(DELAY)
$ python -u psychedelic.py | \
    gst-launch-dynamic filesrc location=gitannex.ogv ! \
    decodebin2 ! videobalance name=vb ! autovideosink
After I wrote this tool, I realised that nothing I had built so far really utilised its potential, so I hope someone will find this and build something cool with it.
Of course, you can’t do that if I don’t point you at the code!