Show Video from IP camera source.
Hello, everyone! I'm trying to display a video stream from an IP camera via GUI.DrawTexture. For example, I used this video URL: http://24.172.4.142/mjpg/video.mjpg?COUNTER (no authorization required). The solution below works fine for image URLs, and I assumed it would also work for IP-camera video, but it doesn't (just a gray texture on screen). Can anyone help me solve this problem?
My code (dragged the *.cs onto MainCamera):
using UnityEngine;
using System.Collections;

public class pot : MonoBehaviour {

    //public string uri = "http://24.172.4.142/mjpg/video.mjpg?COUNTER"; //url for example
    public Texture2D cam;

    public void Start() {
        cam = new Texture2D(1, 1, TextureFormat.RGB24, true);
        StartCoroutine(Fetch());
    }

    public IEnumerator Fetch() {
        while(true) {
            Debug.Log("loading... " + Time.realtimeSinceStartup);
            WWWForm form = new WWWForm();
            WWW www = new WWW("http://24.172.4.142/mjpg/video.mjpg?COUNTER");
            yield return www;
            if(!string.IsNullOrEmpty(www.error))
                throw new UnityException(www.error);
            www.LoadImageIntoTexture(cam);
        }
    }

    public void OnGUI() {
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), cam);
    }
}
Answer by Whlam · Oct 28, 2016 at 12:57 PM
@LdDL, @dragadaga. Hi there! All web streams need to be read with Unity coroutines. This code works like a charm for me. Cheers! =)
using UnityEngine;
using System.Collections;
using System;
using System.Net;
using System.IO;

public class WebStream : MonoBehaviour {

    public MeshRenderer frame; //Mesh for displaying video
    private string sourceURL = "http://server/axis-cgi/mjpg/video.cgi";
    private Texture2D texture;
    private Stream stream;

    public void GetVideo() {
        texture = new Texture2D(2, 2);
        // create HTTP request
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sourceURL);
        //Optional (if authorization is Digest)
        req.Credentials = new NetworkCredential("username", "password");
        // get response
        WebResponse resp = req.GetResponse();
        // get response stream
        stream = resp.GetResponseStream();
        StartCoroutine(GetFrame());
    }

    IEnumerator GetFrame() {
        Byte[] JpegData = new Byte[65536];
        while(true) {
            int bytesToRead = FindLength(stream);
            print(bytesToRead);
            if (bytesToRead == -1) {
                print("End of stream");
                yield break;
            }
            int leftToRead = bytesToRead;
            while (leftToRead > 0) {
                leftToRead -= stream.Read(JpegData, bytesToRead - leftToRead, leftToRead);
                yield return null;
            }
            MemoryStream ms = new MemoryStream(JpegData, 0, bytesToRead, false, true);
            texture.LoadImage(ms.GetBuffer());
            frame.material.mainTexture = texture;
            stream.ReadByte(); // CR after bytes
            stream.ReadByte(); // LF after bytes
        }
    }

    int FindLength(Stream stream) {
        int b;
        string line = "";
        int result = -1;
        bool atEOL = false;
        while ((b = stream.ReadByte()) != -1) {
            if (b == 10) continue; // ignore LF char
            if (b == 13) { // CR
                if (atEOL) { // two blank lines means end of header
                    stream.ReadByte(); // eat last LF
                    return result;
                }
                if (line.StartsWith("Content-Length:")) {
                    result = Convert.ToInt32(line.Substring("Content-Length:".Length).Trim());
                } else {
                    line = "";
                }
                atEOL = true;
            } else {
                atEOL = false;
                line += (char)b;
            }
        }
        return -1;
    }
}
Thank you for the response. It fits my needs (and not only mine, I guess)!
If you get Argument out of range exception, increase buffer size:
Byte [] JpegData = new Byte[100000];
Awesome, thanks for the above code... however I am getting an exception saying:
ArgumentException: no colon found
Parameter name: header
I have tried clearing the headers of the request before setting the credentials, but no luck. The URL I am using also works in a standard browser (after prompting for username and password).
Any advice would be greatly appreciated.
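As a hedged suggestion (not from the original answer): the "no colon found" error tends to come from the header handling on the Digest path, so if your camera also accepts HTTP Basic authentication you could try sending the Authorization header yourself instead of relying on req.Credentials. A minimal sketch with placeholder credentials:

using System;
using System.IO;
using System.Net;
using System.Text;

public static class BasicAuthExample
{
    // Hypothetical workaround (untested against this particular camera): build the
    // Basic auth token manually and set it as an explicit header, bypassing the
    // Digest negotiation that req.Credentials would otherwise trigger.
    public static Stream OpenStream(string sourceURL, string user, string pass)
    {
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sourceURL);
        string token = Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + pass));
        req.Headers["Authorization"] = "Basic " + token;
        return req.GetResponse().GetResponseStream();
    }
}

Whether this helps depends on the camera; if it only accepts Digest, the explicit Basic header will simply be rejected.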
@Whlam @LdDL @dragadaga. The script works fine in the Unity editor, but when I built an APK and tried it on Android it gives no result. I assigned a Plane (MeshRenderer) as the value of the "frame" variable in the script above; it is supposed to display the camera view on the plane, but it only works in the editor, not on Android. When I checked the camera properties, I found that on Android the video stream connection never gets established with the camera's IP server. I need input; I am new to Unity and this looks like a very complex issue.
1) What format do most IP cameras stream feeds in, and at what frame rates? If I were to write a program that reads this feed, downloads it, and stores it on a server, would I get the same quality as a live feed?
2) Will I need to re-encode the IP webcam video stream in real time to some other format (like FLV) before I display it on Android?
@executer. In my case the IP camera works the same on PC and on Android. The IP camera's parameters are set up from the URL request. Example: http://124.122.5.23/axis-cgi/mjpg/video.cgi?resolution=480x360&fps=15&compression=30 This request format must be described in your IP camera's documentation. Just in case, here is my code:
using UnityEngine;
using UnityEngine.UI;
using System.Collections;
using System;
using System.Net;
using System.IO;

public class IPCamera : MonoBehaviour {

    [HideInInspector]
    public Byte[] JpegData;
    [HideInInspector]
    public string resolution = "480x360";
    private Texture2D texture;
    private Stream stream;
    private WebResponse resp;
    public MeshRenderer frame;

    public void StopStream() {
        stream.Close();
        resp.Close();
    }

    public void GetVideo(string ip) {
        texture = new Texture2D(2, 2);
        // allocate the frame buffer if nothing assigned it yet (sized for one JPEG frame)
        if (JpegData == null || JpegData.Length == 0)
            JpegData = new Byte[100000];
        // create HTTP request
        resolution = "320x240";
        string url = "http://" + ip + "/axis-cgi/mjpg/video.cgi?resolution=" + resolution + "&fps=15&compression=30";
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
        req.Credentials = new NetworkCredential("login", "password");
        // get response
        resp = req.GetResponse();
        // get response stream
        stream = resp.GetResponseStream();
        frame.material.color = Color.white;
        StartCoroutine(GetFrame());
    }

    public IEnumerator GetFrame() {
        while(true) {
            int bytesToRead = FindLength(stream);
            if (bytesToRead == -1) {
                // print("End of stream");
                yield break;
            }
            int leftToRead = bytesToRead;
            while (leftToRead > 0) {
                // print (leftToRead);
                leftToRead -= stream.Read(JpegData, bytesToRead - leftToRead, leftToRead);
                yield return null;
            }
            MemoryStream ms = new MemoryStream(JpegData, 0, bytesToRead, false, true);
            texture.LoadImage(ms.GetBuffer());
            frame.material.mainTexture = texture;
            frame.material.color = Color.white;
            stream.ReadByte(); // CR after bytes
            stream.ReadByte(); // LF after bytes
        }
    }

    int FindLength(Stream stream) {
        int b;
        string line = "";
        int result = -1;
        bool atEOL = false;
        while ((b = stream.ReadByte()) != -1) {
            if (b == 10) continue; // ignore LF char
            if (b == 13) { // CR
                if (atEOL) { // two blank lines means end of header
                    stream.ReadByte(); // eat last LF
                    return result;
                }
                if (line.StartsWith("Content-Length:")) {
                    result = Convert.ToInt32(line.Substring("Content-Length:".Length).Trim());
                } else {
                    line = "";
                }
                atEOL = true;
            } else {
                atEOL = false;
                line += (char)b;
            }
        }
        return -1;
    }
}
Thanks @Whlam for the detailed information, but it was an issue with Player Settings and not with the script... how stupid of me. I forgot to set Internet Access to "Require" in Player Settings. It's working now. But thanks anyway :)
Answer by LdDL · Oct 12, 2016 at 06:47 AM
@dragadaga Yes, I just made my server stream a static JPEG image every ~100 ms (refreshing). Then I could set the image as a texture (like in the example above), but the URL looks like "http://1.1.1.1/home/jpeg_stream/cam001.jpeg"
Hello! Now I have a problem: after the start, the picture hangs after a few seconds. How can I fix it? Thank you!
It is not recommended by the manufacturer. The minimum period must be more than 1 second. From the documentation: 2.6.1 JPEG Image (Snapshot) CGI Request. The jpg/image.cgi is used to request a JPEG image (snapshot). A JPEG image (snapshot) should only be used when requiring less than 1 fps.
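For reference, a minimal polling sketch along these lines, assuming the camera exposes a jpg/image.cgi snapshot endpoint as in the quoted documentation (the URL is a placeholder) and keeping the period at one second or more:

using System.Collections;
using UnityEngine;

public class SnapshotPoller : MonoBehaviour {

    // Placeholder snapshot URL; check your camera's documentation for the real path.
    public string snapshotUrl = "http://camera-ip/jpg/image.cgi";
    public MeshRenderer frame;
    private Texture2D texture;

    void Start() {
        texture = new Texture2D(2, 2);
        StartCoroutine(Poll());
    }

    IEnumerator Poll() {
        while (true) {
            WWW www = new WWW(snapshotUrl);
            yield return www;
            if (string.IsNullOrEmpty(www.error)) {
                www.LoadImageIntoTexture(texture);
                frame.material.mainTexture = texture;
            }
            // Per the manufacturer's note above, keep the period at or above one second.
            yield return new WaitForSeconds(1f);
        }
    }
}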
Does anyone actually have a working project they could share? I too am having this issue with IP cameras. As I am not the brightest crayon in the box, it sure would be nice if someone had a sample project they could share :-)
Answer by dragadaga · Oct 12, 2016 at 06:32 AM
Hey, have you solved the problem?
For a video you'll need a movie texture:
https://docs.unity3d.com/Manual/class-MovieTexture.html
But I don't know if it also works with video streams.
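For completeness, a minimal MovieTexture sketch via WWW (the URL is a placeholder). Note that MovieTexture only decodes Ogg Theora files on desktop platforms, so it most likely will not help with an MJPEG camera stream:

using System.Collections;
using UnityEngine;

// Sketch only: MovieTexture plays pre-encoded Ogg Theora clips, not MJPEG streams.
public class MovieTextureExample : MonoBehaviour {

    public string movieUrl = "http://example.com/clip.ogv"; // placeholder URL
    public MeshRenderer frame;

    IEnumerator Start() {
        WWW www = new WWW(movieUrl);
        MovieTexture movie = www.movie;
        // Wait until enough of the file has been buffered to start playback.
        while (!movie.isReadyToPlay)
            yield return null;
        frame.material.mainTexture = movie;
        movie.Play();
    }
}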
Answer by franz909 · Jan 18, 2018 at 06:53 AM
Hey @Whlam
What kind of Layer did you use to texture the video on? I have the same problem as others, that there is just a grey field that is not displaying anything. I used a 'Cube' in the Frame Property of the Script.
Many Greets, Franz
Hi @franz909. It doesn't matter what the layer is or what type of mesh you use. Probably your camera has a different type of stream; this example is for an MJPEG stream. See the documentation for your camera.
Thanks @Whlam for your respone!
I made it work in the Unity Player.
My problem is as follows: the target platform for this project is UWP, but a UWP build is not possible because there is no 'HttpWebRequest' and 'MemoryStream' in UWP.
We tried to include the using directive 'Windows.' in the code, but this leads to other issues.
Do you have any experience with that? PS: We want to use this on a Microsoft HoloLens.
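As a possible workaround for the UWP build (the streaming MJPEG case is covered by the answer below): UnityWebRequest is available on UWP where HttpWebRequest is not, so if the camera also offers a single-JPEG snapshot URL it could be polled like this. This is only a sketch; the snapshot URL is a placeholder and depends on your camera:

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.UI;

// Sketch of a UWP-friendly snapshot fetcher using UnityWebRequest.
public class UwpSnapshotFetcher : MonoBehaviour {

    public string snapshotUrl = "http://camera-ip/jpg/image.cgi"; // placeholder
    public RawImage frame;

    IEnumerator Start() {
        while (true) {
            UnityWebRequest req = new UnityWebRequest(snapshotUrl);
            req.downloadHandler = new DownloadHandlerTexture();
            yield return req.Send(); // SendWebRequest() on newer Unity versions
            if (string.IsNullOrEmpty(req.error))
                frame.texture = DownloadHandlerTexture.GetContent(req);
            yield return new WaitForSeconds(1f);
        }
    }
}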
Answer by dooleydragon · Jan 11, 2019 at 05:07 PM
@Franz909 @Whlam Hello, I'm also trying to port it over to the HoloLens and I'm receiving the same issue. Have you been able to find a solution? If not: I've been able to send UDP packets to control a servo from the ESP8266 Wi-Fi module and use the ArduCam OV2640 2MP camera to capture video with this code in Unity. I will attempt to use the same coding structure, and hopefully that will bypass the compilation errors but still run on the HoloLens, since it can still access libraries from the .NET framework. You can follow my progress on my blog www.arvibe.com.
I've posted this as an answer to my question back on Dec 2 of 2018 but it's awaiting moderation... don't know why? But here you go!
I was working on creating my own memory stream and stream.ReadByte(), since I can't seem to get it to build for the HoloLens. I'm also using UnityWebRequest and a custom DownloadHandlerScript. There are also some links in the resources that may already achieve this, but some people have not had much success with implementation.
For a simpler answer for IP cameras that just send JPEGs: I'm not sure if this is the best method, but it works on the HoloLens and on any other device that has build issues with what was previously mentioned.
1. image-based IP cameras only (ImageStream.cs)
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

//stream for image-based IP cameras
//Open Global IP's to try
//Busy Parking lot
//http://107.144.24.100:88/record/current.jpg?rand=509157
//Beach
//http://107.144.24.100/record/current.jpg?rand=291145
public class ImageStream : MonoBehaviour {

    public string uri = "http://107.144.24.100/record/current.jpg?rand=291145";
    public RawImage frame;
    private Texture2D texture;

    // Use this for initialization
    void Start () {
        texture = new Texture2D(2, 2);
        StartCoroutine(GetImage());
    }

    IEnumerator GetImage()
    {
        yield return new WaitForEndOfFrame();
        using (WWW www = new WWW(uri))
        {
            yield return www;
            www.LoadImageIntoTexture(texture);
            frame.texture = texture;
            Start();
        }
    }

    // Update is called once per frame
    void Update()
    {
    }
}
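A small design note on the script above: calling Start() from inside the coroutine restarts the fetch chain after each image. An equivalent variant that simply loops inside the coroutine (same assumptions, same behaviour, with a basic error check added) could look like this:

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Variant of ImageStream that loops inside the coroutine instead of re-calling Start().
public class ImageStreamLoop : MonoBehaviour {

    public string uri = "http://107.144.24.100/record/current.jpg?rand=291145";
    public RawImage frame;
    private Texture2D texture;

    void Start() {
        texture = new Texture2D(2, 2);
        StartCoroutine(GetImage());
    }

    IEnumerator GetImage() {
        while (true) {
            yield return new WaitForEndOfFrame();
            using (WWW www = new WWW(uri)) {
                yield return www;
                if (string.IsNullOrEmpty(www.error)) {
                    www.LoadImageIntoTexture(texture);
                    frame.texture = texture;
                }
            }
        }
    }
}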
Then you have IP cameras that continuously send bytes of data through an open stream. Here I use UnityWebRequest instead of WWW in a MonoBehaviour script, plus a custom DownloadHandlerScript to access the downloaded content, convert the bytes to find the 'Content-Length', read from the JPEG start marker "FFD8" to the end marker "FFD9", and load the result into a RawImage texture.
2. Streaming IP cameras (WebStream.cs)
using UnityEngine;
using System.Collections;
using UnityEngine.Networking;
using UnityEngine.UI;
using System;

//Open IP-camera address for nice beach in Spain
// http://212.170.100.189:80/mjpg/video.mjpg
public class WebStream : MonoBehaviour
{
    //Custom download handler script
    CustomWebRequest customWebRequest;
    //Web request
    UnityWebRequest webRequest;
    [Tooltip("Set this to view the video")]
    public RawImage frame;
    [Tooltip("Set this to change the url of the streaming IP camera")]
    public string url = "http://212.170.100.189:80/mjpg/video.mjpg";
    [Tooltip("Set this to change the servo connection")]
    public string servoURL = "SERVO-IP-ADDRESS";
    [Tooltip("Set this to change the received data buffer increases lag")]
    public int BufferSize = 100000;
    [Tooltip("Connect button to connect to device")]
    public Button ConnectButton;
    [Tooltip("Connecting button status")]
    public Button ConnectingButton;
    [Tooltip("Disconnect button to disconnect from device")]
    public Button DisconnectButton;
    [Tooltip("IP address input field")]
    public InputField InputIPAddress;
    [Tooltip("Pass rate field decreases lag increases distortion")]
    public InputField InputPassRate;
    [Tooltip("Buffer rate input by user during runtime")]
    public InputField InputBufferRate;
    [Tooltip("Received data pass rate input by user during runtime")]
    public InputField InputRxDataRate;
    [Tooltip("Text to display pass percentage")]
    public Text PercentageDisplay;
    [Tooltip("Information panel to display results")]
    public Image InfoPanel;
    public Vector3 rightLeft;
    public Vector3 upDown;
    private string moveData;
    byte[] byteArray;
    byte[] bytes = new byte[100000];

    //Connect button pressed
    public void onConnect()
    {
        bytes = new byte[BufferSize];
        if (InputIPAddress.textComponent.text == "")
        {
            return;
        }
        url = InputIPAddress.textComponent.text;
        Debug.Log(url);
        ConnectButton.gameObject.SetActive(false);
        InputBufferRate.gameObject.SetActive(false);
        InputRxDataRate.gameObject.SetActive(true);
        customWebRequest = new CustomWebRequest(bytes);
        GetVideo();
        StartCoroutine(initConnecting());
    }

    //Waits 6 seconds for the first image to be downloaded then continuous streaming
    IEnumerator initConnecting()
    {
        ConnectingButton.gameObject.SetActive(true);
        yield return new WaitForSeconds(6);
        if (customWebRequest.Connected == true)
        {
            ConnectingButton.gameObject.SetActive(false);
            DisconnectButton.gameObject.SetActive(true);
        }
        else
        {
            ConnectingButton.gameObject.SetActive(false);
            onDisconnect();
            PercentageDisplay.text = "Please check connection!!!";
        }
    }

    //Disconnects from source and dispose of request
    public void onDisconnect()
    {
        InputIPAddress.textComponent.text = "";
        DisconnectButton.gameObject.SetActive(false);
        ConnectButton.gameObject.SetActive(true);
        InputBufferRate.gameObject.SetActive(true);
        InputRxDataRate.gameObject.SetActive(false);
        PercentageDisplay.text = "change buffer rate only when disconnected";
        webRequest.Dispose();
    }

    //Checks if bytes length is certain percentage of content-length to display image
    public void onPassRateEdit(string passRate)
    {
        if (InputPassRate.text == "")
        {
            return;
        }
        float updateRate = float.Parse("." + InputPassRate.text);
        customWebRequest.ImagePassPercent = updateRate;
    }

    //Sets buffer rate
    public void onBufferRate(string BufferRate)
    {
        if (InputBufferRate.text == "")
        {
            return;
        }
        BufferSize = Int32.Parse(InputBufferRate.text);
        customWebRequest.bufferSize = BufferSize;
    }

    //Sets received data rate
    public void onRxDataRate(string passRate)
    {
        if (InputRxDataRate.text == "")
        {
            return;
        }
        float RxDataRate = float.Parse("." + InputRxDataRate.text);
        customWebRequest.RxDataPassRate = RxDataRate;
    }

    //initialization
    public void Start()
    {
        InputIPAddress.text = url;
    }

    //moves servos (not needed, this was for an IoT device)
    public void move(string postData)
    {
        moveData = postData;
        string data = "?move=" + postData;
        UnityWebRequest www = new UnityWebRequest(servoURL + data);
        www.Send();
        if (www.isNetworkError || www.isHttpError)
        {
            Debug.Log(www.error);
        }
        else
        {
            // Debug.Log("Upload complete!");
        }
    }

    //connect to URL and assign custom download handler script to the connection's download handler
    public void GetVideo()
    {
        webRequest = new UnityWebRequest(url);
        webRequest.downloadHandler = customWebRequest;
        webRequest.Send();
    }

    //update every frame
    private void Update()
    {
        //update the information panel
        if (!ConnectButton.IsActive())
        {
            PercentageDisplay.text = "Pass Rate: " + customWebRequest.ImagePassPercent * 100 + "%" +
                "\nRx Pass Rate: " + customWebRequest.RxDataPassRate * 100 + "%" +
                "\nConnected: " + customWebRequest.Connected +
                "\nContent-Length: " + customWebRequest.contentLength +
                "\nRx Data: " + customWebRequest.RxDataLength +
                "\nJPEG: " + customWebRequest.JPEGsize +
                "\nBuffer: " + customWebRequest.bufferSize +
                "\nmoving: " + moveData;
        }
        //moves camera attached to servos on the IoT IP camera
        if (Input.GetKey(KeyCode.A))
            move("left");
        if (Input.GetKey(KeyCode.D))
            move("right");
        if (Input.GetKey(KeyCode.W))
            move("up");
        if (Input.GetKey(KeyCode.S))
            move("down");
        if (Input.GetKey(KeyCode.LeftShift))
            move("center");
        /*
        #elif UNITY_IOS
        rightLeft = new Vector3(0, Input.acceleration.x, 0);
        upDown = new Vector3(Input.acceleration.y, 0, 0);
        if (Input.acceleration.x > .2f)
            move("right");
        if (Input.acceleration.x < -.2f)
            move("left");
        if (Input.acceleration.y < -.95f)
            move("up");
        if (Input.acceleration.y > -.75f)
            move("down");
        #endif
        */
    }

    //Close the connection if the application is closed
    private void OnApplicationQuit()
    {
        if (webRequest != null) {
            webRequest.Dispose();
        }
    }

    //On focus display information panel to screen view.
    private void OnApplicationFocus(bool focus)
    {
        if (focus == false)
        {
            InputPassRate.gameObject.SetActive(false);
            InfoPanel.gameObject.SetActive(false);
            InputBufferRate.gameObject.SetActive(false);
        }
        else
        {
            InputPassRate.gameObject.SetActive(true);
            InfoPanel.gameObject.SetActive(true);
            InputBufferRate.gameObject.SetActive(true);
        }
    }
}
3. Custom DownloadHandlerScript (CustomWebRequest.cs)
using System;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.UI;

//custom download handler request
public class CustomWebRequest : DownloadHandlerScript
{
    /////////////////////////////////////////////////////////
    /// Standard scripted download handler -
    /// allocates memory on each ReceiveData callback.
    //////////////////////////////////////////////////////////////////
    GameObject Image;
    public float ImagePassPercent = .50f;
    public float RxDataPassRate = .50f;
    public int contentLength;
    public int JPEGsize;
    public int bufferSize;
    public int RxDataLength;
    public byte[] RxData;
    public bool Connected = false;
    WebStream webStream;
    Texture2D camTexture = new Texture2D(2, 2);
    public float timeElapsed;
    public int prevLength;

    public CustomWebRequest() : base()
    {
    }

    ////////////////////////////////////////////////////////////////////
    /// Pre-allocated scripted download handler reuses
    /// the supplied byte array to deliver
    /// data and eliminates memory allocation.
    /////////////////////////////////////////////////////////////
    public CustomWebRequest(byte[] buffer) : base(buffer)
    {
    }

    ////////////////////////////////////////////////////
    /// Required by DownloadHandler base class.
    /// Called when you address the 'bytes' property.
    ///////////////////////////////////////////////////
    protected override byte[] GetData() { return null; }

    ///////////////////////////////////////////////////
    /// Called once per frame when data
    /// has been received from the network.
    //////////////////////////////////////////////////
    protected override bool ReceiveData(byte[] byteFromCamera, int dataLength)
    {
        RxData = byteFromCamera;
        RxDataLength = dataLength; //return dataLength to WebStream to display on screen.
        bufferSize = byteFromCamera.Length; //return bufferSize to be displayed on screen.
        if (byteFromCamera == null || byteFromCamera.Length < 1)
        {
            Debug.Log("CustomWebRequest :: ReceiveData - received a null/empty buffer");
            return false;
        }
        contentLength = FindLength(byteFromCamera); // find the length of the JPEG
        Debug.Log(contentLength);
        if (contentLength == 0)
        {
            //Debug.Log("Not enough Bytes to read!");
            byteFromCamera = new byte[bufferSize];
            return true;
        }
        //string inputStream = Encoding.Default.GetString(byteFromCamera);
        //inputStream = ToHexString(inputStream);
        string inputStream = BitConverter.ToString(byteFromCamera).Replace("-", "");
        //receiving array same size as bytes from camera
        byte[] RecievedImageByte = new byte[inputStream.Length / 2];
        //find the beginning of the JPEG and trim excess
        inputStream = inputStream.Substring(inputStream.IndexOf("FFD8")).Trim();
        //if the bytes contain the end of the JPEG, keep the span from beginning to end;
        //else return true and a new buffer
        if (inputStream.Contains("FFD9"))
        {
            inputStream = inputStream.Substring(inputStream.IndexOf("FFD8"), inputStream.IndexOf("FFD9") + 4);
        }
        else
        {
            byteFromCamera = new byte[bufferSize];
            return true;
        }
        //convert string to bytes, committing changes
        if (StringToByteArray(inputStream) != null)
        {
            RecievedImageByte = StringToByteArray(inputStream);
        }
        else { return true; }
        //We want the beginning of the JPEG file
        if (inputStream.IndexOf("FFD8") == 0)
        {
            //Check to see if the stream and length are readable (Needs tuning)
            if (imageReadable(contentLength, inputStream))
            {
                //create the image array with the length of the inputStream
                byte[] completeImageByte = new byte[inputStream.Length];
                DisplayImage(RecievedImageByte, completeImageByte, inputStream);
                RecievedImageByte = new byte[inputStream.Length / 2];
                completeImageByte = new byte[bufferSize];
                inputStream = "";
            }
            else
            {
                Debug.Log("Image not readable!");
            }
        }
        //return a new buffer
        byteFromCamera = new byte[bufferSize];
        return true;
    }

    //Call from WebStream to get content-length
    public int ReceivedCameraData()
    {
        return contentLength;
    }

    ////////////////////////////////////////////////////
    /// Returns whether the image is readable
    ///////////////////////////////////////////////////
    public bool imageReadable(int contentLength, string inputStream)
    {
        float passRate;
        float dataPassRate;
        passRate = contentLength * ImagePassPercent;
        dataPassRate = contentLength * RxDataPassRate;
        if (inputStream.Length / 2 > contentLength || inputStream.Length / 2 < passRate || RxDataLength < dataPassRate)
        {
            return false;
        }
        return true;
    }

    //////////////////////////////
    /// Display Image
    /////////////////////////////
    public bool DisplayImage(byte[] RecievedImageByte, byte[] completeImageByte, string inputStream)
    {
        //BlockCopy in place of MemoryStream
        Buffer.BlockCopy(RecievedImageByte, inputStream.IndexOf("FFD8"), completeImageByte, 0, inputStream.Length / 2);
        //We have to use GameObject.Find
        //unless we instantiate a prefab; I'll add it later
        Image = GameObject.Find("RawImage");
        WebStream webStream = Image.GetComponent<WebStream>();
        RawImage screenDisplay = webStream.frame;
        JPEGsize = inputStream.Length / 2;
        //Load bytes into texture
        camTexture.LoadImage(completeImageByte);
        Connected = true;
        //Assign the texture to the RawImage GameObject
        screenDisplay.color = Color.white;
        screenDisplay.texture = camTexture;
        //return that it was successful
        return true;
    }

    //////////////////////////////////////////////////////
    /// Called when all data has been received
    /// from the server and delivered via ReceiveData.
    //////////////////////////////////////////////////////////////
    protected override void CompleteContent()
    {
        Debug.Log("CustomWebRequest :: CompleteContent - DOWNLOAD COMPLETE!");
    }

    ////////////////////////////////////////////
    /// Called when a Content-Length
    /// header is received from the server.
    /////////////////////////////////////////////////
    protected override void ReceiveContentLength(int contentLength)
    {
        Debug.Log(string.Format("CustomWebRequest :: ReceiveContentLength - length {0}", contentLength));
    }

    /////////////////////////////////////////////////////////////////////
    /// Finds the length of the JPEG image received
    ///////////////////////////////////////////////////////////////////
    public int FindLength(byte[] bytesReceived)
    {
        int position = 1;
        int content = 0;
        string inputStream = "";
        //convert stream to string hex
        inputStream = BitConverter.ToString(bytesReceived).Replace("-", "");
        //find "h: " of "content-length: " and 6 positions after
        //for content-length size
        position = inputStream.IndexOf("683A20") + 6;
        int contentToRead = inputStream.IndexOf("FFD8") - position;
        if (inputStream.Contains("683A20"))
        {
            if (contentToRead > 0)
            {
                string contentLength = inputStream.Substring(position, contentToRead);
                if (contentLength.Length > 32) { return 0; }
                content = Int32.Parse(FromHexString(contentLength));
            }
        }
        return content;
    }

    /////////////////////////////////////////////////////////////
    /// Converts a hex string back to a byte array.
    ///////////////////////////////////////////////////////////
    public static byte[] StringToByteArray(string hex)
    {
        int count = 0;
        while (hex.Length % 2 != 0)
        {
            hex = hex.Insert(hex.Length - 7, "0");
            count++;
            if (count == 5) { return null; }
        }
        int NumberChars = hex.Length;
        byte[] bytes = new byte[NumberChars / 2];
        for (int i = 0; i < NumberChars; i += 2)
            bytes[i / 2] = Convert.ToByte(hex.Substring(i, 2), 16);
        return bytes;
    }

    //////////////////////////////////////////////////////////////
    /// Converts a hex string back to text (used for the content length).
    ////////////////////////////////////////////////////////////
    public static string FromHexString(string hexString)
    {
        var bytes = new byte[hexString.Length / 2];
        for (var i = 0; i < bytes.Length; i++)
        {
            bytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);
        }
        return Encoding.UTF8.GetString(bytes); // returns content length
    }

    public static string ToHexString(string str)
    {
        var sb = new StringBuilder();
        var bytes = Encoding.Default.GetBytes(str);
        foreach (var t in bytes)
        {
            sb.Append(t.ToString("X2"));
        }
        return sb.ToString(); // returns: "48656C6C6F20776F726C64" for "Hello world"
    }
}
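A side note on the design: the hex-string round trip above works, but the same SOI/EOI (FFD8/FFD9) search can be done directly on the byte array, which avoids the string allocations. A minimal helper sketch (not part of the original post; names are illustrative only):

using System;

public static class JpegFrameScanner
{
    // Finds the first complete JPEG (0xFF 0xD8 ... 0xFF 0xD9) inside 'buffer' and
    // copies it out, or returns null if no complete frame is present yet.
    public static byte[] ExtractFrame(byte[] buffer, int length)
    {
        int start = IndexOfMarker(buffer, length, 0xFF, 0xD8, 0);
        if (start < 0) return null;
        int end = IndexOfMarker(buffer, length, 0xFF, 0xD9, start + 2);
        if (end < 0) return null;
        byte[] frame = new byte[end + 2 - start];
        Buffer.BlockCopy(buffer, start, frame, 0, frame.Length);
        return frame;
    }

    static int IndexOfMarker(byte[] buffer, int length, byte hi, byte lo, int from)
    {
        for (int i = from; i + 1 < length; i++)
            if (buffer[i] == hi && buffer[i + 1] == lo) return i;
        return -1;
    }
}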
Some of the resources that helped in guiding development:
How to understand webresponse:
How to stream live footage from camera to unity3d:
https://stackoverflow.com/questions/39494986/streaming-live-footage-from-camera-to-unity3d
How to convert a byte array to a hexadecimal string, and vice versa:
Show Video from IP camera source:
https://answers.unity.com/questions/1151512/show-video-from-ip-camera-source.html
Unity3D Project displaying video from Mjpg stream:
https://github.com/DanielArnett/SampleUnityMjpegViewer/blob/master/Assets/Scripts/MjpegProcessor.cs
I really hope this helps others in future developments with HoloLens, Unity3d, and IP cameras.
You can view the code and setup instructions on my GitHub or blog at www.arvibe.com.