Gamify The Real Mobile Tank - Augmented Reality Update

In our previous demo, Gamify The Real Mobile Tank, we showed how to control the movement of a real mobile tank by moving a virtual tank in a Unity game. This was done by establishing a network connection between the virtual tank in the game and the real mobile tank, using a Raspberry Pi and a motor driver. A Python script runs on Raspbian OS on the Raspberry Pi. The Unity game, running on a laptop, sends messages about the virtual tank's movement over the Wi-Fi network to the Python script. The Python script reads each command and sends the corresponding signals to the motor driver attached to the Raspberry Pi. The motor driver is connected to the real tank and, per the signals received from the Python script, moves the tank.
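The Pi-side script described above can be sketched roughly as follows. This is a minimal illustration, not the original script: the command names, pin numbers, and port are placeholders, and the GPIO wiring must match your own motor driver.

```python
# Minimal sketch of the Raspberry Pi side: a TCP server that reads movement
# commands from the Unity game and drives the motor-driver input pins.
# Command names, pin numbers, and the port are hypothetical placeholders.
import socket

# Map each movement command to the motor-driver input pins to energise.
# Pin numbers are placeholders; use the pins your motor driver is wired to.
COMMAND_PINS = {
    "forward":  (17, 22),
    "backward": (18, 23),
    "left":     (17, 23),
    "right":    (18, 22),
    "stop":     (),
}

def parse_command(raw: bytes) -> tuple:
    """Decode a command from the game and return the pins to set high.
    Unknown commands are treated as 'stop' so the tank never runs away."""
    cmd = raw.decode("ascii", errors="ignore").strip().lower()
    return COMMAND_PINS.get(cmd, COMMAND_PINS["stop"])

def serve(host="0.0.0.0", port=22001):
    """Accept the Unity game's connection and apply each command to the pins."""
    import RPi.GPIO as GPIO  # only available on the Pi itself
    GPIO.setmode(GPIO.BCM)
    all_pins = {p for pins in COMMAND_PINS.values() for p in pins}
    for p in all_pins:
        GPIO.setup(p, GPIO.OUT)
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while True:
                data = conn.recv(64)
                if not data:
                    break
                active = parse_command(data)
                for p in all_pins:
                    GPIO.output(p, GPIO.HIGH if p in active else GPIO.LOW)
    GPIO.cleanup()
```

The key design point is that the Pi is a plain TCP server: the game only ever sends short ASCII command strings, so the script stays simple and any unknown input safely stops the motors.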

 

Now, in this video, we take this research a step further: we introduce a real hurdle in front of the real mobile tank using Augmented Reality. As soon as the hurdle appears in front of the real tank, the same hurdle appears in front of the tank in the game. Interesting, isn't it? The framework added to this demo to make it possible is Apple's ARKit; specifically, it is ARKit's image recognition API that makes this work.

 

An iPhone 7 has been mounted on the real mobile tank to work as the tank's eye. How? Let us understand this in more detail. An ARKit app running on the iPhone 7 uses the iPhone's camera to identify an image. As soon as the app identifies the image, it sends a hurdle-detection message over the Wi-Fi network to the Unity game running on the laptop (remember the Unity game from the previous video), and a similar hurdle is created in the Unity game.

 

List of Components
Download Tank Game:

https://assetstore.unity.com/packages/essentials/tutorial-projects/tanks-tutorial-46209?aid=1100l7wkp

ARKit:

https://developer.apple.com/documentation/arkit

Workflow

How to use Augmented Reality in Robotics

C# source code for detecting the image and sending a command to the tank game

For the tank-controlling code, go to this page.

using System;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.ARFoundation;
using System.Net.Sockets;
using System.Text;


[RequireComponent(typeof(ARTrackedImageManager))]
public class TrackedImageInfoManager : MonoBehaviour
{
    private TcpClient socketConnection = null;

    private void ConnectToTcpServer()
    {
        try
        {
            // The laptop running the Unity tank game listens on this LAN
            // address and port; change them to match your own network.
            socketConnection = new TcpClient("192.168.0.104", 22002);
            Debug.Log("socket created..");
        }
        catch (Exception e)
        {
            Debug.Log("On client connect exception " + e);
        }
    }

    [SerializeField]
    [Tooltip("The camera to set on the world space UI canvas for each instantiated image info.")]
    Camera m_WorldSpaceCanvasCamera;

    
    public Camera worldSpaceCanvasCamera
    {
        get { return m_WorldSpaceCanvasCamera; }
        set { m_WorldSpaceCanvasCamera = value; }
    }

    [SerializeField]
    [Tooltip("If an image is detected but no source texture can be found, this texture is used instead.")]
    Texture2D m_DefaultTexture;

   
    public Texture2D defaultTexture
    {
        get { return m_DefaultTexture; }
        set { m_DefaultTexture = value; }
    }

    ARTrackedImageManager m_TrackedImageManager;

    void Awake()
    {
        // Poll every 2 seconds and forward the detection state to the game.
        InvokeRepeating("SendMessage", 2.0f, 2.0f);

        if (socketConnection == null)
            ConnectToTcpServer();
        m_TrackedImageManager = GetComponent<ARTrackedImageManager>();
    }

    void OnEnable()
    {
        m_TrackedImageManager.trackedImagesChanged += OnTrackedImagesChanged;
    }

    void OnDisable()
    {
        m_TrackedImageManager.trackedImagesChanged -= OnTrackedImagesChanged;
    }
    // Current detection state, and the last state actually sent to the game
    // (a message is only sent when the state changes).
    string dtTxt = "notdetected";
    string prevdtxt = "notdetected";

    void UpdateInfo(ARTrackedImage trackedImage)
    {
        // Set canvas camera
        var canvas = trackedImage.GetComponentInChildren<Canvas>();
        canvas.worldCamera = worldSpaceCanvasCamera;

        // Update information about the tracked image
        var text = canvas.GetComponentInChildren<Text>();
        text.text = string.Format(
            "{0}\ntrackingState: {1}\nGUID: {2}\nReference size: {3} cm\nDetected size: {4} cm",
            trackedImage.referenceImage.name,
            trackedImage.trackingState,
            trackedImage.referenceImage.guid,
            trackedImage.referenceImage.size * 100f,
            trackedImage.size * 100f);
        if (trackedImage.trackingState == TrackingState.Limited || trackedImage.trackingState == TrackingState.None)
            dtTxt = "notdetected";
        else if (trackedImage.trackingState == TrackingState.Tracking)
            dtTxt = "detected";

        var planeParentGo = trackedImage.transform.GetChild(0).gameObject;
        var planeGo = planeParentGo.transform.GetChild(0).gameObject;

        if (trackedImage.trackingState != TrackingState.None)
        {
            planeGo.SetActive(true);

            trackedImage.transform.localScale = new Vector3(trackedImage.size.x, 1f, trackedImage.size.y);

            // Set the texture
            var material = planeGo.GetComponentInChildren<MeshRenderer>().material;
            material.mainTexture = (trackedImage.referenceImage.texture == null) ? defaultTexture : trackedImage.referenceImage.texture;
        }
        else
        {
            planeGo.SetActive(false);
        }
    }
    void SendMessage()
    {
        // Only notify the game when the detection state has changed,
        // and only if the TCP connection was actually established.
        if (dtTxt != prevdtxt && socketConnection != null)
        {
            try
            {
                NetworkStream stream = socketConnection.GetStream();
                if (stream.CanWrite)
                {
                    byte[] clientMessageAsByteArray = Encoding.ASCII.GetBytes(dtTxt);
                    stream.Write(clientMessageAsByteArray, 0, clientMessageAsByteArray.Length);
                    Debug.Log("Client sent his message - should be received by server");
                }
            }
            catch (SocketException socketException)
            {
                Debug.Log("Socket exception: " + socketException);
            }
            prevdtxt = dtTxt;
        }
    }
    void OnTrackedImagesChanged(ARTrackedImagesChangedEventArgs eventArgs)
    {
        foreach (var trackedImage in eventArgs.added)
        {
            // Give the initial image a reasonable default scale
            trackedImage.transform.localScale = new Vector3(0.01f, 1f, 0.01f);

            UpdateInfo(trackedImage);
        }

        foreach (var trackedImage in eventArgs.updated)
            UpdateInfo(trackedImage);

        foreach (var trackedImage in eventArgs.removed)
            UpdateInfo(trackedImage);


    }
}
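To test the Unity game's hurdle listener without the iPhone on hand, a small stand-in client can connect the way the C# script above does and send the same "detected"/"notdetected" strings. This is a hypothetical helper, not part of the original project; the host and port are the ones hardcoded in the C# code and should be adjusted for your network.

```python
# Hypothetical test client: stands in for the iPhone app by connecting to the
# Unity game's TCP listener and sending the same detection strings the C#
# script sends ("detected" / "notdetected").
import socket

def send_detection(state: str, host: str = "192.168.0.104", port: int = 22002) -> bytes:
    """Validate the state string and return the ASCII payload the ARKit app
    would send; if the game's listener is reachable, also transmit it."""
    if state not in ("detected", "notdetected"):
        raise ValueError("state must be 'detected' or 'notdetected'")
    payload = state.encode("ascii")
    try:
        with socket.create_connection((host, port), timeout=2) as s:
            s.sendall(payload)
    except OSError:
        pass  # no game running; payload is still returned for inspection
    return payload
```

Sending `send_detection("detected")` while the game is running should make the hurdle appear in front of the virtual tank, which makes it easy to debug the game side independently of ARKit.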
 
