
Equivalent pipeline gst-launch-1.0 #22

Open
Tidus84 opened this issue Aug 6, 2020 · 2 comments

@Tidus84

Tidus84 commented Aug 6, 2020

Firstly, congratulations on your work! :-)

I'm working with a Basler camera and I'm studying your code in order to better understand GStreamer. In particular, I'm interested in understanding what the equivalent pipeline is for sending a video stream from my Basler camera...

E.g., demopylongstreamer asks me to launch the following command in a terminal on the receiver PC:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink sync=false async=false -e

Suppose I wanted to experiment with the equivalent command to launch a sender from the terminal:

gst-launch-1.0 appsrc ! videoconvert ! x264enc speed-preset=1 ! video/x-h264,stream-format=byte-stream ! rtph264pay ! udpsink host=127.0.0.1 port=5000 sync=false async=false

There's an appsrc because I'm writing a simple sender in C++ with OpenCV.

Is that command correct? I tried it, but it doesn't work, and the GStreamer documentation is sparse... :-(

Thank you in advance for your response! :-)
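
For context: appsrc produces no data by itself when run from gst-launch-1.0; it only flows when an application pushes buffers into it, which is the usual reason such a command does not work. Below is a minimal sketch of what that application side could look like in C++ with OpenCV, assuming OpenCV was built with its GStreamer backend (cv::CAP_GSTREAMER). The camera index, resolution, and frame rate are placeholders; the pipeline string is the one from the sender command above.

#include <opencv2/opencv.hpp>
#include <string>

int main()
{
    // Placeholder capture source; a Basler camera would normally be opened
    // through pylon, this simply grabs whatever OpenCV sees as device 0.
    cv::VideoCapture cap(0);
    if (!cap.isOpened())
        return 1;

    const int width  = 640;   // placeholder values
    const int height = 480;
    const double fps = 30.0;

    // Same pipeline string as the sender command above; OpenCV pushes each
    // BGR frame into appsrc and GStreamer handles encoding and UDP output.
    const std::string pipeline =
        "appsrc ! videoconvert ! x264enc speed-preset=1 "
        "! video/x-h264,stream-format=byte-stream ! rtph264pay "
        "! udpsink host=127.0.0.1 port=5000 sync=false async=false";

    // Uses the VideoWriter overload with an apiPreference (OpenCV 3.4+);
    // the fourcc value is ignored by the GStreamer backend.
    cv::VideoWriter writer(pipeline, cv::CAP_GSTREAMER, 0, fps,
                           cv::Size(width, height), true);
    if (!writer.isOpened())
        return 1;

    cv::Mat frame;
    while (cap.read(frame))
    {
        cv::resize(frame, frame, cv::Size(width, height));
        writer.write(frame);   // one appsrc buffer per frame
    }
    return 0;
}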

@frankSDeviation

frankSDeviation commented Aug 29, 2020

Hello,

I am doing something similar to what you wish to do. I could not get this program to work completely on my system, but I was able to use @joshdoe's vision plugin. I can currently send two video streams with almost zero latency over UDP. The pipeline I am currently running for one camera is:

gst-launch-1.0 -v pylonsrc pixel-format=mono8 width=1600 height=1200 ! "video/x-raw,format=GRAY8" ! videoflip method=vertical-flip ! videoconvert ! x264enc bitrate=30000 speed-preset=superfast qp-min=30 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=5000

On the receiver PC, I am running this pipeline:

gst-launch-1.0 -v udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! queue ! decodebin ! videoconvert ! videoscale ! autovideosink sync=true

I hope this information helps! I am sure these pipelines will work with this program. The sender pipeline might also work with the "parse" argument of the demopylongstreamer sample.

Cheers,
Frank
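
If the receiving side is an application rather than gst-launch, the same receiver pipeline can be handed to OpenCV with autovideosink swapped for appsink. The sketch below is only an illustration of that idea, again assuming OpenCV was built with the GStreamer backend; the video/x-raw,format=BGR cap before appsink is an added assumption so frames arrive in OpenCV's usual BGR layout.

#include <opencv2/opencv.hpp>
#include <string>

int main()
{
    // Frank's receiver pipeline with autovideosink replaced by appsink, so
    // the decoded frames end up in the application instead of a window.
    const std::string pipeline =
        "udpsrc port=5000 caps=\"application/x-rtp, media=(string)video, "
        "clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96\" "
        "! rtph264depay ! queue ! decodebin ! videoconvert "
        "! video/x-raw,format=BGR ! appsink";

    cv::VideoCapture cap(pipeline, cv::CAP_GSTREAMER);
    if (!cap.isOpened())
        return 1;

    cv::Mat frame;
    while (cap.read(frame))
    {
        cv::imshow("receiver", frame);
        if (cv::waitKey(1) == 27)   // ESC quits
            break;
    }
    return 0;
}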

@MattsProjects
Owner

Hi,
So sorry I am seeing this so late!!! :-(
This program does not create a source plugin that can be used with gst-launch. It creates a fully finished application.
The GStreamer people really only intend for gst-launch to be used for testing/debugging (although I think most people make up pipelines with it and put them into some shell scripts to create the "application" :)).
For example, you can use gst-launch and GStreamer's test camera source to experiment with different pipelines for streaming across networks, saving to disk, etc.
Then once you have that backend pipeline as you like it, you can copy/paste the configuration into this project's sample code.
Then once you build it, you have a final binary you can run and ship, like a Windows .exe.
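
As an illustration of that workflow (not this project's actual sample code), a pipeline string tested with gst-launch-1.0 and GStreamer's videotestsrc can be dropped into a small C++ program with gst_parse_launch() and built into a standalone binary, for example:

// Build with: g++ sender.cpp $(pkg-config --cflags --libs gstreamer-1.0)
#include <gst/gst.h>

int main(int argc, char* argv[])
{
    gst_init(&argc, &argv);

    // Same kind of string you would pass to gst-launch-1.0, here using the
    // test camera source instead of a real camera; host/port are examples.
    GError* error = nullptr;
    GstElement* pipeline = gst_parse_launch(
        "videotestsrc ! videoconvert ! x264enc speed-preset=1 "
        "! rtph264pay ! udpsink host=127.0.0.1 port=5000", &error);
    if (!pipeline)
    {
        g_printerr("Parse error: %s\n", error->message);
        g_clear_error(&error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until an error or end-of-stream is posted on the bus.
    GstBus* bus = gst_element_get_bus(pipeline);
    GstMessage* msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

    if (msg)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}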
