I want to learn VR, mainly because I think something interesting is going on there. I have a big project in mind and I will use it to learn VR: check my previous post.
What I managed to do
Installation and hardware
The installation of Daydream by Google was quite easy, but the documentation is already obsolete and you will have to adapt things here and there.
Regarding the hardware, I bought a Nexus 6P phone for the headset, and I have a Nexus 5 that I use as the remote. I already had a (plastic) Cardboard, and I use my Bluetooth keyboard for this experiment.
I watched a few videos to learn the basics of Blender and to be able to create a keyboard in 3D. It was quite easy because I limited myself to a basic cube for each key. Then I named each key with the name used in Unity, so I can access it easily from a Unity script (C#).
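Because each key cube keeps its Blender name once imported, a Unity script can look it up by that name. A minimal sketch of the idea (the names `Key_A` and the component layout are illustrative assumptions, not my actual project's names):

```csharp
using UnityEngine;

// Attach to the root object of the imported keyboard model.
public class KeyLookup : MonoBehaviour
{
    void Start()
    {
        // transform.Find resolves a child by the name given in Blender,
        // so naming the cubes carefully in Blender pays off here.
        Transform keyA = transform.Find("Key_A"); // hypothetical key name
        if (keyA != null)
        {
            Debug.Log("Found key cube: " + keyA.name);
        }
    }
}
```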
For the other points, I will learn as I go.
I started with the demo from Google, so I did not have to create a new scene from scratch; the remote worked quickly and I could move the cube from the demo.
Then I integrated the 3D models from Blender, the "Snowman" and "Rabbit" I had created during the video tutorials, and added what was needed in the code so I could move these objects too. Fun.
I also added a block to represent the remote (the Nexus 5), so it was now physically present in the scene (the basic Google demo only shows the pointer), and I wrote a little code so the block moved on the three axes.
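Driving that block from the real controller is a few lines once the Google VR SDK is in the project. A rough sketch, assuming the early Daydream-era `GvrController` API (the field name may differ in later SDK versions):

```csharp
using UnityEngine;

// Attached to the block that stands in for the Nexus 5 remote.
// Daydream only reports rotation (3DoF), so any position change
// would have to be simulated, not tracked.
public class RemoteBlock : MonoBehaviour
{
    void Update()
    {
        // Mirror the physical controller's orientation onto the block.
        transform.localRotation = GvrController.Orientation;
    }
}
```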
Eventually I imported the full 3D keyboard into Unity and added a text field to the scene. After some time digging through the Unity documentation (not easy to find on that topic), I managed to get the events for each key, so I could add visual feedback on the 3D model: when you press a key on the real keyboard, the virtual one reacts.
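The per-key feedback can be done with Unity's standard `Input.GetKey` polling and a material color swap. A simplified sketch of that pattern (the orange color and the one-component-per-cube layout are my assumptions here, not necessarily how the original project is wired):

```csharp
using UnityEngine;

// One instance per key cube; set 'key' in the Inspector to match
// the physical key that this cube represents.
public class KeyFeedback : MonoBehaviour
{
    public KeyCode key = KeyCode.A; // which physical key to watch
    private Renderer rend;
    private Color idleColor;

    void Start()
    {
        rend = GetComponent<Renderer>();
        idleColor = rend.material.color; // remember the untextured default
    }

    void Update()
    {
        // Turn the cube orange while the physical key is held down.
        rend.material.color = Input.GetKey(key)
            ? new Color(1f, 0.5f, 0f)
            : idleColor;
    }
}
```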
In the end, the result is nice but not what I had in mind, because of a limitation of the Daydream platform.
What I learned
Unfortunately I ran into a limitation of the Daydream platform (at least in the way I use it): positional tracking in space is not possible. If the user moves, or moves the remote through space, those positions cannot be reflected in the virtual environment; only rotation is tracked. For positional tracking we would need external trackers, like the ones the Vive uses. It is a well-known limitation that I had forgotten.
Step by step
The second step will be to get a working web browser and use the keyboard and the remote to surf the web.
I will experiment with a Wiimote (trying to connect it first), and I also want to get a live video stream to create a feedback screen showing the real world.
Unfortunately I did not find a way to capture a video of the app; most capture apps do not work with VR enabled. In this screenshot you can see the keyboard with no key pressed. The keys have no textures (lazy me); a key turns orange when you press it. On the right is the remote, and in the back the pointer (orange). There is also a floating screen showing the text you type on the keyboard.