Test – Start Screen – Main Screen – End Screen


The project, made in Unity3D, is going to be projected on the wall screen at the Wirelab location.

For the final prototype there is a Stand-By Scene (with the Push The Button video) that transitions to the Main Scene. When the counter in the Main Scene reaches 60 seconds, the project transitions to the End Scene, which shows the user information about the website and their performance for 10 seconds and then transitions back to the Stand-By Scene.
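As a rough illustration of how that timing can be handled, below is a minimal sketch of a countdown script; the scene name "EndScene" and the exposed duration are assumptions, not the project's actual values, and newer Unity versions would use SceneManager.LoadScene instead.

```csharp
using UnityEngine;

// Minimal sketch: count up and load the next scene when the duration is reached.
public class SceneCountdown : MonoBehaviour
{
    public float duration = 60f;          // 60 s in the Main Scene, 10 s in the End Scene
    public string nextScene = "EndScene"; // assumed scene name, added to the Build Settings

    private float elapsed = 0f;

    void Update()
    {
        elapsed += Time.deltaTime;
        if (elapsed >= duration)
        {
            Application.LoadLevel(nextScene); // older Unity API; SceneManager.LoadScene in newer versions
        }
    }
}
```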


Functionalities Stand


In the support for the button we have also included an Arduino board, to which a buzzer, 9 LEDs and an ultrasonic sensor are connected. The purpose of this is to detect people approaching and light up the support to catch passers-by's attention.

By default, the maximum range of the ultrasonic sensor is set to 45 cm. For our installation we need at least 1 m. To fix this, I found some tips and ideas on forums and tech websites (e.g. http://www.instructables.com/id/Improve-Ultrasonic-Range-Sensor-Accuracy/).

 

Test – Scene Transition


For the stand-by stage, to attract people to use our installation, I've made another scene in Unity3D that relies on the GUI principle. When a keyboard key is pressed, the next level from the Build Settings is started.
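A minimal sketch of that keyboard-triggered transition could look like the following; the key used here (Space) is just an example, and Application.LoadLevel is the scene-loading API from that Unity generation.

```csharp
using UnityEngine;

// Sketch: load the next level from the Build Settings when a key is pressed.
public class KeyToNextLevel : MonoBehaviour
{
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.Space)) // any key could be mapped here
        {
            Application.LoadLevel(Application.loadedLevel + 1);
        }
    }
}
```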


But instead of requiring a keyboard key to be pressed, I've programmed it so that pressing the wireless button starts the Main Scene.
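Below is a hedged sketch of that wiring, assuming the wireless button reaches Unity as a character sent by the Arduino over a serial port; the port name, baud rate, trigger character and scene name are all assumptions, not the project's actual setup.

```csharp
using System.IO.Ports;
using UnityEngine;

// Sketch: start the Main Scene when the Arduino reports the button press ('1').
public class ButtonToMainScene : MonoBehaviour
{
    SerialPort port = new SerialPort("COM3", 9600); // port name is an assumption

    void Start()
    {
        port.ReadTimeout = 10;
        port.Open();
    }

    void Update()
    {
        try
        {
            if ((char)port.ReadByte() == '1')
            {
                Application.LoadLevel("MainScene"); // assumed scene name
            }
        }
        catch (System.TimeoutException) { } // no data received this frame
    }

    void OnDestroy()
    {
        if (port.IsOpen) port.Close();
    }
}
```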

The Push The Button video is integrated on the plane as a MovieTexture (https://docs.unity3d.com/Manual/class-MovieTexture.html) and loops until the button is pressed. In the Main Scene with the cubes that react to movement and play sound, the button has no effect, so it no longer interferes with the user interaction.
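For the looping part, a short sketch along the lines of the linked manual page could be:

```csharp
using UnityEngine;

// Sketch: keep the Push The Button video looping on the plane's material.
public class LoopIntroVideo : MonoBehaviour
{
    void Start()
    {
        MovieTexture movie = GetComponent<Renderer>().material.mainTexture as MovieTexture;
        if (movie != null)
        {
            movie.loop = true; // loop until the button press changes the scene
            movie.Play();
        }
    }
}
```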

Test 03: Unity – Arduino – Wireless Button

In this test the wireless button is connected to the platform in the scene: when the button is pressed, the main platform rises and the countdown starts, kicking off the interaction.
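A simplified sketch of this behaviour is shown below; the target height, speed and the way the button flag gets set are assumptions for illustration, not the project's actual values.

```csharp
using UnityEngine;

// Sketch: once the button signal arrives, raise the platform and run the countdown.
public class PlatformTrigger : MonoBehaviour
{
    public Transform platform;       // the main platform in the scene
    public float targetHeight = 2f;  // assumed rise height
    public float riseSpeed = 1f;
    public bool buttonPressed;       // set to true by whatever reads the wireless button

    public float countdown = 60f;
    bool started;

    void Update()
    {
        if (buttonPressed) started = true;

        if (started)
        {
            if (platform.position.y < targetHeight)
                platform.Translate(Vector3.up * riseSpeed * Time.deltaTime);

            countdown -= Time.deltaTime;
        }
    }
}
```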

We haven't implemented this function in the final prototype, because it didn't fit well enough with the concept and the overall installation.

Test – Projection Mapping

Since we decided to use projection mapping as part of the main concept, I've tried to find several solutions for integrating it into our installation.

On my laptop, which runs Windows, I found out how to connect MadMapper and Unity3D through a plugin called Whyphon. By just assigning the plugin and changing some details of the connection and ports, I was able to project onto objects from the scene.

But the computer installed at the Wirelab that we are going to use runs on a Mac. On Mac OS X, projection mapping onto objects in Unity3D can be done using Syphon.

We encountered some problems connecting MadMapper to Syphon and Unity3D, but we expect to have this working by next week.

In case this is unsuccessful, the backup plan is to use another Unity3D project focused mainly on projecting a playful animation on the ground that also reacts to movement.

 

Test 02: Unity – Arduino – Button Signals

In this experiment we've tested the reaction time of the buttons (which are linked to the Arduino) in connection with the 3 cubes made in Processing, so that we could proceed with the wireless button tests.

The 3 cubes move to the right when the green button is pressed and to the left when the black button is pressed. When a button is released, the cubes hold a static position and stop moving.
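For illustration, here is a minimal sketch of that mapping, written as a Unity C# script to match the other examples (the same logic would apply in a Processing sketch); the serial characters 'G', 'B' and 'S' and the port settings are assumptions about what the Arduino sends.

```csharp
using System.IO.Ports;
using UnityEngine;

// Sketch: move right while 'G' arrives, left while 'B' arrives, stop on 'S'.
public class ButtonCubeMover : MonoBehaviour
{
    SerialPort port = new SerialPort("COM3", 9600); // assumed port settings
    public float speed = 2f;
    int direction = 0; // -1 = left, 0 = still, 1 = right

    void Start()
    {
        port.ReadTimeout = 10;
        port.Open();
    }

    void Update()
    {
        try
        {
            char c = (char)port.ReadByte();
            if (c == 'G') direction = 1;
            else if (c == 'B') direction = -1;
            else if (c == 'S') direction = 0;
        }
        catch (System.TimeoutException) { } // no new button state this frame

        transform.Translate(Vector3.right * direction * speed * Time.deltaTime);
    }
}
```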

Test 01: Unity – Arduino

In this test I've connected a potentiometer to the Arduino in order to test the connection between Arduino and Unity. A box in the scene reacts to the rotation of the potentiometer.
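A hedged sketch of the Unity side of this test, assuming the Arduino prints the raw 0–1023 potentiometer reading as one line of text per sample (port name and message format are assumptions):

```csharp
using System.IO.Ports;
using UnityEngine;

// Sketch: map the potentiometer reading (0-1023) to the box's rotation.
public class PotentiometerBox : MonoBehaviour
{
    SerialPort port = new SerialPort("COM3", 9600); // assumed port settings

    void Start()
    {
        port.ReadTimeout = 10;
        port.Open();
    }

    void Update()
    {
        try
        {
            int value = int.Parse(port.ReadLine());  // e.g. "512"
            float angle = value / 1023f * 360f;      // map 0-1023 to 0-360 degrees
            transform.rotation = Quaternion.Euler(0f, angle, 0f);
        }
        catch (System.Exception) { } // no complete reading this frame
    }
}
```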

The second purpose of this test was to give the user the option to change the genre of the music beats visualized on the main screen.

We ended up not implementing this functionality in the installation, because we agreed not to use sounds from different genres but to stick to a more abstract feel instead.