The previous session was full of hope, mainly because I got something working very quickly, but now I'm stuck. I have spent two weeks on a problem that I still haven't fully solved. Let's talk about that.
The idea is to get a web browser working in VR; I'm doing this to learn Unity, Daydream, and VR in general. The final goal is to build a working environment in which you can develop for the environment itself, so a working web browser is a natural starting point.
But I'm far from there.
Pieces of the puzzle
Nothing that I tried has worked so far. I even started over outside of Unity (native Android) to check whether the project would be easier that way (it is not). I spent hours on the Internet looking for documentation and examples. I found that Samsung and Oculus got this working, which gave me some hope.
Oculus has released some code that could be a good start for me. I was very lucky, because it was part of the Oculus SDK 1.0.3 and it disappeared in the next version (I don't know why).
In this code, Oculus implemented a MediaPlayer (using the Android MediaPlayer class) streaming to a Unity texture, as an example in their SDK. A good start for my project.
At the same time I discovered that any Android view can be rendered into an OpenGL texture, thanks to Felix Jones, who wrote a very good article about it.
That was it: all the pieces were available, so it had to be possible to combine them and get my idea working!
The Oculus MediaPlayer in action
Example app from Felix Jones
Of course it is not so easy. First of all, I don't know much about Unity, JNI, C++ or even OpenGL (and OpenGL looks very complicated IMHO). I started to read and try to understand the code from Oculus and the blog post from Felix Jones, but I guess it is more than I can handle.
No problem, I asked for help on the Internet, and even reached out to Felix Jones about his blog post (the Android-view-to-OpenGL one). He was nice enough to answer a few questions, but in the end he told me that this information was too strategic for his company to help me further. I can understand that; it is fair.
Learning the hard way
This is the technical workflow:
It is an Android app for Google Daydream (VR) that uses Unity as the 3D engine; Unity is backed by OpenGL to get things on screen. I use a plugin to instantiate native Android views (a TextView or a WebView, for example) from Unity, then on the Android side I render these views into an OpenGL texture. The texture comes from a plugin written in C++ (most of this code comes from Oculus): the C++ code creates an OpenGL texture and hands the "texture id" to both Android and Unity. It is the link between these two worlds.
- The C++ plugin creates the texture in the current Unity OpenGL context
- Java creates the view and renders it into the OpenGL texture
- Unity renders the texture (by its texture id) onto the final object
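As a rough illustration of the Java step, here is a minimal sketch inspired by Felix Jones's technique (all class and variable names here are hypothetical, not from the actual project): the texture id created by the C++ plugin is wrapped in a SurfaceTexture, the view draws itself into the backing Surface, and updateTexImage() copies the latest frame into the texture on the GL thread. One subtlety worth knowing: SurfaceTexture expects a GL_TEXTURE_EXTERNAL_OES texture id, not a plain GL_TEXTURE_2D one.

```java
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.PorterDuff;
import android.graphics.SurfaceTexture;
import android.view.Surface;
import android.view.View;

// Hypothetical sketch: draws an Android view into an OpenGL texture.
// `textureId` is assumed to be the id created by the C++ plugin in
// Unity's GL context (as a GL_TEXTURE_EXTERNAL_OES texture).
public class ViewToGLRenderer {
    private final SurfaceTexture surfaceTexture;
    private final Surface surface;

    public ViewToGLRenderer(int textureId, int width, int height) {
        // Wrap the GL texture id so Android can draw into it
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setDefaultBufferSize(width, height);
        surface = new Surface(surfaceTexture);
    }

    public void drawView(View view) {
        // Lock a canvas backed by the texture and let the view draw itself
        Canvas canvas = surface.lockCanvas(null);
        try {
            canvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR);
            view.draw(canvas);
        } finally {
            surface.unlockCanvasAndPost(canvas);
        }
    }

    public void updateTexture() {
        // Must be called on the GL thread: latches the latest frame
        surfaceTexture.updateTexImage();
    }
}
```

This is only a sketch of the idea under the assumptions above; the real plumbing between Unity, the C++ plugin, and this class is where most of my time went.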
I almost got the full process working, but along the way I encountered so many problems. The trickiest one I discovered is that Android's WebView is hardware accelerated by default, which messes up the OpenGL context. It is important to disable hardware acceleration on the Android views. From that point I decided to restart all my experiments with a basic TextView first.
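Concretely, disabling hardware acceleration on a single view is a one-liner from the standard Android API (here on a hypothetical `webView` instance):

```java
// Force software rendering for this view so its drawing goes through the
// Canvas we provide, instead of the WebView's own GL layer which would
// conflict with Unity's OpenGL context.
webView.setLayerType(View.LAYER_TYPE_SOFTWARE, null);
```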
Today, the proof of concept is almost working. I only have one last problem: the texture that comes from the Android view (a TextView in my example) arrives in Unity as a 1px by 1px texture (stretched over the object). It is strange, but since I'm a beginner, I'm sure I missed something basic somewhere. I will find a solution!
I did not have time to play with the Nintendo Wii controller as I wanted, so once I get the WebView working I may look at that. I also want people to get visual feedback about the real world around them, so I'll try to get a live video stream from the camera into the VR world. I have many more ideas, but I need to go step by step because this is quite hard.
Documentation and articles
Check out some code