C# TCP/IP Video Streaming.

Thread starter: m00n (Guest)
All, I'm pretty new to all this so I'm hoping to get some conceptual understanding as well as code help.


My server application is a Unity3D application that connects to either a webcam on a PC or a phone's camera. I'm using Unity because it lets me target both Windows and Android with one code base. Unity3D is using .NET 4.x.

The Unity app connects to the webcam and streams one picture at a time over TCP/IP.


SERVER SENDING THE IMAGE

Here is where I'm sending the image. The image is coming in via the byte[] image parameter.

public void SendData( byte[] image )
{
    if( _clientSocket != null && _clientSocket.Connected )
    {
        //_networkStream.BeginWrite( image, 0, image.Length, SentToClient, null );
        //_networkStream.BeginRead( _buffFromClient, 0, 1, GetClientResponse, null );

        _networkStream.Write( image, 0, image.Length );
        _networkStream.Read( _buffFromClient, 0, 1 );
    }
}
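
Since TCP is a byte stream with no message boundaries, a common remedy is to prefix each image with its length so the receiver knows where one frame ends and the next begins. A minimal sketch of the send side, assuming a 4-byte length prefix (the `SendFrame` helper is illustrative, not part of the original code):

```csharp
using System;
using System.IO;

public static class Framing
{
    // A TCP stream has no message boundaries, so prefix each
    // image with its byte length before writing the image itself.
    public static void SendFrame( Stream stream, byte[] image )
    {
        // 4-byte length prefix (little-endian on most platforms).
        byte[] lengthPrefix = BitConverter.GetBytes( image.Length );
        stream.Write( lengthPrefix, 0, lengthPrefix.Length );
        stream.Write( image, 0, image.Length );
    }
}
```

The receiver can then read exactly 4 bytes, decode the length, and keep reading until that many image bytes have arrived.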



CLIENT GETTING THE IMAGE


//Kicks off the initial BeginRead from the server

private void StartStreaming( )
{
    if( _client.Connected )
    {
        // Init the buffer and begin reading.
        _buffer = new byte[ _client.ReceiveBufferSize ];
        _serverStream.BeginRead( _buffer, 0, _client.ReceiveBufferSize, BytesReceived, null );
    }
}

private void BytesReceived( IAsyncResult AR )
{
    if( !_client.Connected )
        return;

    if( _buffer.Length > 0 )
    {
        try
        {
            Array.Resize( ref _buffer, _client.ReceiveBufferSize );

            using( MemoryStream ms = new MemoryStream( _buffer ) )
            {
                System.Drawing.Image img = Image.FromStream( ms );
                if( img != null )
                    picBox.Image = img;
            }
        }
        catch( Exception e )
        {
            //string s = e.Message;
            //MessageBox.Show( s );
        }

        // The server runs in "Block" mode.
        _serverStream.Write( new byte[ ] { 1 }, 0, 1 );
        _serverStream.BeginRead( _buffer, 0, _client.ReceiveBufferSize, BytesReceived, null );
    }
}




The server seems to be operating just fine; the problem is on the client side, which wants to use the image. After lots of reading I've learned that you can't guarantee the full image will arrive all at once — it may come in parts.

If that is the case, how do you accurately go about getting the full image? I would have thought that putting the stream in blocking mode and calling Write(…) would have ensured the full image was sent over the wire in one chunk.

If that's not the case, then how would I restructure my client code to asynchronously read the buffer from the network stream? When I call

_serverStream.BeginRead( _buffer, 0, _client.ReceiveBufferSize, BytesReceived, null );

Won't that attempt to pull everything off the stream?
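
For reference, a single Read or BeginRead only returns whatever bytes happen to have arrived so far, which may be less than a full image. A sketch of how a length-prefixed frame could be read back, assuming the sender writes a 4-byte length before each image (the `ReadExactly` and `ReadFrame` helper names are illustrative, not part of the original code):

```csharp
using System;
using System.IO;

public static class FrameReader
{
    // Read exactly count bytes, looping because a single Read
    // may return fewer bytes than requested.
    public static byte[] ReadExactly( Stream stream, int count )
    {
        byte[] buffer = new byte[ count ];
        int offset = 0;
        while( offset < count )
        {
            int read = stream.Read( buffer, offset, count - offset );
            if( read == 0 )
                throw new EndOfStreamException( "Connection closed mid-frame." );
            offset += read;
        }
        return buffer;
    }

    // Read one frame: 4-byte length prefix, then that many image bytes.
    public static byte[] ReadFrame( Stream stream )
    {
        byte[] lengthPrefix = ReadExactly( stream, 4 );
        int length = BitConverter.ToInt32( lengthPrefix, 0 );
        return ReadExactly( stream, length );
    }
}
```

With this shape, the image bytes handed to `Image.FromStream` are always a complete frame, regardless of how TCP split the transmission.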



Rick
