<a id="comment-1712276"></a>
<div class="field field-name-comment-body field-type-text-long field-label-hidden"><div class="field-items"><div class="field-item even" property="content:encoded"><p>Good day.</p>
<p><strong class="quote-header">Citation :</strong><blockquote class="quote-msg quote-nest-1 odd"><div class="quote-author"></div><br />
I'm really not an expert at all, but wouldn't it be better to compress it as a video?<br />
</blockquote><br />
This can certainly give a better compression ratio.</p>
<p><strong class="quote-header">Citation :</strong><blockquote class="quote-msg quote-nest-1 odd"><div class="quote-author"></div><br />
So for example using something like H264.<br />
</blockquote><br />
I don't know your hardware resources, but software H.264 compression is not fast; you may not be able to maintain real-time performance. It is reasonable to consider hardware H.264 encoding (e.g., the Intel Media SDK).</p>
<p><strong class="quote-header">Citation :</strong><blockquote class="quote-msg quote-nest-1 odd"><div class="quote-author"></div><br />
But the question is - is it possible to get live camera frames, compress them as a video (using UMC), send this data manually via TCP or UDP (WinSock) and then on the client application get somehow the individual uncompressed frames?<br />
</blockquote><br />
You can send and receive data any way you like; you just need to ensure ordering and consistency on the far end. A common way to transmit video streams over a network is RTP: <a href="http://tools.ietf.org/html/rfc3550">http://tools.ietf.org/html/rfc3550</a>.<br />
Camera -&gt; encoder -&gt; RTP packetization -&gt; network transfer -&gt; depacketization -&gt; decoding -&gt; rendering.</p>
<p><strong class="quote-header">Citation :</strong><blockquote class="quote-msg quote-nest-1 odd"><div class="quote-author"></div><br />
The point is that we are using wi-fi and we need to transmit as little data as possible, in real time (or with as low latency as possible). And also, we have to deal with cases when the wi-fi signal is quite low, which in the current implementation means incredibly low framerates, because TCP tries to transfer the whole frame at all costs :) I would prefer a decrease in video quality over a decrease in framerate.<br />
</blockquote><br />
You can split the stream into chunks; it is usually better to keep them no larger than the MTU of the current network, to avoid packet fragmentation during transfer.<br />
The bit rate on the server side can be adjusted depending on packet loss and channel bandwidth.<br />
Latency during encoding and decoding usually depends on overall encoder speed, the number of reference frames, restrictions on frame reordering, and additional features such as threading.</p>
</div></div></div>Fri, 19 Oct 2012 11:04:58 +0000, mad\pvlasov (comment 1712276 at https://software.intel.com)