Send RenderTexture via TCP or UDP
I am using Unity as the back-end game engine that sends camera views to a C# application that simulates a control console. The C# console can send information back to change camera movements and switch active cameras. The console will show eight 320x240 views and two 640x480 views of the game running in Unity. I am looking for some information on the general approach I should take to achieve this. Here's what I've tried so far, along with the results:
Approach 1: I have 10 cameras in Unity rendering to texture. The RenderTextures are sent via UDP to the C# app. In a test with 128x128 RenderTextures everything works fine and the game runs at 20-30 fps, but since UDP has a packet size limit I can't really go much higher than 128x128. I could continue with this method by sending multiple packets to represent one frame and then stitching them back together in the C# app, but I think I will start hitting performance issues when I try to get eight 320x240 and two 640x480 views.
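For the multi-packet idea, a minimal sketch of what the chunking could look like. The 12-byte header layout (frame id, chunk index, chunk count) and the 1400-byte payload limit are assumptions chosen to stay under a typical Ethernet MTU, not anything Unity or .NET prescribes:

```csharp
using System;
using System.Collections.Generic;

// Sketch: split one encoded frame into UDP-safe chunks and reassemble them.
// Header layout (12 bytes, all assumptions): frameId, chunkIndex, chunkCount.
public static class UdpChunker
{
    public const int MaxPayload = 1400; // assumed safe payload size under a typical MTU

    public static List<byte[]> Split(byte[] frame, int frameId)
    {
        int count = (frame.Length + MaxPayload - 1) / MaxPayload;
        var packets = new List<byte[]>(count);
        for (int i = 0; i < count; i++)
        {
            int offset = i * MaxPayload;
            int size = Math.Min(MaxPayload, frame.Length - offset);
            var packet = new byte[12 + size];
            BitConverter.GetBytes(frameId).CopyTo(packet, 0);
            BitConverter.GetBytes(i).CopyTo(packet, 4);
            BitConverter.GetBytes(count).CopyTo(packet, 8);
            Buffer.BlockCopy(frame, offset, packet, 12, size);
            packets.Add(packet);
        }
        return packets;
    }

    // Receiving side: sort by chunk index and concatenate the payloads
    // once all chunks for one frameId have arrived.
    public static byte[] Reassemble(List<byte[]> packets)
    {
        packets.Sort((a, b) =>
            BitConverter.ToInt32(a, 4).CompareTo(BitConverter.ToInt32(b, 4)));
        var stream = new System.IO.MemoryStream();
        foreach (var p in packets)
            stream.Write(p, 12, p.Length - 12);
        return stream.ToArray();
    }
}
```

Each packet would then go out via UdpClient.Send, and the receiver buffers by frameId; since UDP gives no delivery guarantee, incomplete frames simply get dropped rather than stalling the stream.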
Approach 2: I have the same Unity game sending out the rendered images via TCP. The game starts to run really slowly when I send out even one 512x512 image.
TCP Code to send out images:
using UnityEngine;
using System;
using System.Net;
using System.Net.Sockets;

public class TCP_ImageSend : MonoBehaviour
{
    public RenderTexture sendRenderTextureMain;
    private Texture2D sendImage2DMain;
    private int sent;

    private IPEndPoint ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 9050);
    private Socket server = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);

    void Start()
    {
        sendImage2DMain = new Texture2D(sendRenderTextureMain.width, sendRenderTextureMain.height, TextureFormat.RGB24, false);
        try
        {
            server.Connect(ipep);
        }
        catch (SocketException e)
        {
            Debug.Log(e.ToString());
        }
    }

    void Update()
    {
        // Read back from the RenderTexture; use its actual size rather than a hard-coded rect.
        RenderTexture.active = sendRenderTextureMain;
        sendImage2DMain.ReadPixels(new Rect(0, 0, sendRenderTextureMain.width, sendRenderTextureMain.height), 0, 0);
        sendImage2DMain.Apply();
        RenderTexture.active = null;
        sent = SendVarData(server, sendImage2DMain.EncodeToPNG());
    }

    // Length-prefixed send: a 4-byte size header followed by the image bytes.
    private static int SendVarData(Socket s, byte[] data)
    {
        int total = 0;
        int size = data.Length;
        int dataleft = size;

        byte[] datasize = BitConverter.GetBytes(size);
        s.Send(datasize);
        while (total < size)
        {
            // Socket.Send may write fewer bytes than asked for, so loop.
            int sent = s.Send(data, total, dataleft, SocketFlags.None);
            total += sent;
            dataleft -= sent;
        }
        return total;
    }
}
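For completeness, the receiving side of this length-prefixed framing has to read the 4-byte size first and then keep reading until the whole frame has arrived, since TCP may deliver it in arbitrary pieces. A sketch of what the console app could use (the class and method names here are made up for illustration; it works on any Stream, e.g. a NetworkStream wrapped around the accepted socket):

```csharp
using System;
using System.IO;

// Counterpart to SendVarData above: read one length-prefixed frame.
public static class FrameReader
{
    public static byte[] ReadFrame(Stream stream)
    {
        byte[] header = ReadExact(stream, 4);
        int size = BitConverter.ToInt32(header, 0);
        return ReadExact(stream, size);
    }

    // TCP is a byte stream: Read may return fewer bytes than requested,
    // so keep reading until the buffer is full.
    static byte[] ReadExact(Stream stream, int count)
    {
        var buffer = new byte[count];
        int total = 0;
        while (total < count)
        {
            int read = stream.Read(buffer, total, count - total);
            if (read <= 0) throw new IOException("connection closed mid-frame");
            total += read;
        }
        return buffer;
    }
}
```

On the console side you would loop on ReadFrame and load each returned byte[] into an image for display.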
Approach 3: Make the game networked, so that each instance of the game can take part of the load of rendering to texture and sending out network streams. I assume networking the game like this would be an alternative to threading.
The main issue I'm running into is performance, due to the time it takes to send the image streams over the network. If I understand correctly, Unity cannot put the RenderTexture reads and network streams onto a different thread.
Is there something I need to do with the code to optimize the image processing and network streaming?
I'm generally trying to see if I'm heading down the right path. How do I get eight 320x240 and two 640x480 camera views out of Unity without the game running at 2 fps? Perhaps there are better ways and approaches for what I'm trying to do.
Answer by DaveA · Mar 21, 2012 at 08:18 PM
I'm dealing with a couple of similar issues. In one project, I'm monitoring live video from a factory (the game simulates the factory, and you can access the live feed when you navigate around the 'game' version). In that case, we just grab frames from the video server: we detect the effective resolution they will be displayed at, request that size (but no larger than a user-defined maximum; 640x480 is the biggest we dare), and request JPEG at about 75% quality.
The other is sending webcam video between two Unity instances. In this case we really need good frame rate and quality, and that means good compression. Look at JPEG, MJPEG, and H.264 (the latter is preferred, but tricky to implement on either end).
So my message is: PNG may not be the best choice. You really need to compress that stuff as much as possible, and never send more pixels than will be viewed. Look into OpenCV. You may want to put the conversion and/or 'send' in their own threads too. Use a multi-core processor for sure.
If you have the means, probably approach 3 would be best, you can eliminate graphics rendering as a bottleneck that way.
Dave A., thank you for the quick response.
I will look into JPEG, H.264, and OpenCV.
I think I tried putting the send on another thread, but Unity gave an error saying it needs to be done on the main thread. I will look at this again after the image compression.
I appreciate your help and suggestions.
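On the threading point: Unity API calls (ReadPixels, EncodeToPNG/EncodeToJPG) do have to stay on the main thread, but a plain Socket or Stream write on an already-encoded byte[] can be handed to a worker thread. A minimal sketch, assuming a length-prefixed protocol like the one in the question (the class name and the queue capacity of 4 are arbitrary choices for illustration):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

// Sketch: keep encoding on the main thread, move only the network write
// to a background worker that owns the stream/socket.
public class FrameSender : IDisposable
{
    readonly BlockingCollection<byte[]> queue = new BlockingCollection<byte[]>(boundedCapacity: 4);
    readonly Stream stream;
    readonly Thread worker;

    public FrameSender(Stream stream)
    {
        this.stream = stream;
        worker = new Thread(SendLoop) { IsBackground = true };
        worker.Start();
    }

    // Called from Update() after EncodeToJPG(); returns false (dropping the
    // frame) if the network can't keep up, instead of stalling the game.
    public bool Enqueue(byte[] encodedFrame) => queue.TryAdd(encodedFrame);

    void SendLoop()
    {
        foreach (var frame in queue.GetConsumingEnumerable())
        {
            stream.Write(BitConverter.GetBytes(frame.Length), 0, 4);
            stream.Write(frame, 0, frame.Length);
        }
        stream.Flush();
    }

    public void Dispose()
    {
        queue.CompleteAdding(); // lets SendLoop finish draining, then exit
        worker.Join();
    }
}
```

The bounded queue is the important design choice here: if the consumer falls behind, frames are dropped at the producer rather than piling up or blocking Update().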
Answer by Thirdblood · Mar 26, 2012 at 07:07 PM
I am doing something similar as well. I am sending live video from one of my six Unity cameras to a separate Qt application using TCP. I have found that the only way to remove the lag that occurs in Unity when taking progressive screenshots (about one every 0.1 seconds) is to use a small render texture (I am using 256x256). I then send all the pixels over TCP and reconstruct the image at the other end.
Let me know if you figure out a better way.
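As a rough sanity check on sending raw pixels (the figures below are assumed for illustration, not from this thread), uncompressed RGB24 frames add up quickly, which is one reason compression was suggested above:

```csharp
using System;

// Back-of-the-envelope bandwidth calculation for raw, uncompressed frames.
public static class Bandwidth
{
    public static long BytesPerSecond(int width, int height, int bytesPerPixel, int fps)
        => (long)width * height * bytesPerPixel * fps;
}
```

At roughly 10 fps (one screenshot every 0.1 s), a 256x256 RGB24 frame stream is 256 * 256 * 3 * 10 = 1,966,080 bytes/s (about 1.9 MB/s), and a single 640x480 view at the same rate is about 8.8 MB/s, before any TCP overhead.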
Hello, I am trying to do the same. Can you please share the code you have developed? It would help me a lot.
Answer by prs6 · May 21, 2012 at 06:48 PM
Hello, I'm new to the Unity programming environment. Into what kind of file should the proposed code be copied, and how can I launch it? Thanks a lot...
Answer by prs6 · May 21, 2012 at 05:25 PM
Hi everybody, I'm very new to the Unity world, and I need to do exactly what you are discussing. However, I don't know where to paste the proposed code for sending images over TCP: what kind of file, where it goes, and how to launch it. Thanks a lot. Best regards...