Monday, December 14, 2015

At AWE, I experienced the world's first eye-tracking VR display

Editor's note: the following is a first-hand trip report from the show floor, as recorded by Oliver Kreylos (Doc-Ok).


When I attended the Augmented World Expo (AWE) in 2013, I was drowning in an ocean of AR. This year was different: AWE teamed up with UploadVR to co-host the conference. So this time, I'm bringing you my hands-on notes on several VR displays.

eMagin 2k×2k VR HMD

eMagin's as-yet-unnamed head-mounted display was my main reason for coming to this AWE. Some specifications and details had surfaced beforehand, but I was skeptical of the claimed 80°×80° field of view. Unlike the Oculus Rift, HTC/Valve Vive, and similar HMDs, eMagin's headset is based on OLED microdisplays (microdisplays are their core business), and microdisplay-based HMDs — including eMagin's own earlier Z800 3DVisor — have always had disappointing fields of view, generally no more than 40°. After all, magnifying a screen of only about one square centimeter to cover most of the visual field requires much more complex optics than magnifying a several-inch display.


It turns out my suspicions were wrong, and eMagin backed up their claims. I tried the headset and had an in-depth, friendly breakfast conversation with their project manager Dan Cui. So, what are the details? First and foremost, this is a very early prototype — not a development kit, let alone a consumer product. It is currently heavy (close to 500 grams?), with most of the weight in the front; no sensors of any kind are installed; there is no head strap; and the screens are not in their final form (details below). Dan mentioned that they are aiming to ship the final developer edition in the fourth quarter of next year.

That said, the optics of this device are very surprising, and its overall shape and design are something I would actually want to wear in public — at least once it has an augmented reality mode.

It has a distinctly cyberpunk style.

Let's start with the display system. Each eye gets its own 2k×2k OLED microdisplay panel (0.63″×0.63″) with an extremely high pixel fill factor (no screen-door effect). In the current prototype, the screens run at 60Hz in full-persistence mode (connected to a computer over a DisplayPort 1.2 interface). According to Dan, a low-persistence mode with 1ms illumination is immediately feasible, 85Hz can be reached without any circuit changes, and 120Hz is achievable with the OLED screens currently in use.

The panel's biggest problem right now: its pixels have no blue sub-pixels. Unfortunately, I couldn't see past that on my first try, because all the content was rendered in a warm yellow-red palette inside a virtual environment. Dan assures me that full RGB panels are already in production and will be integrated soon. The quality of those screens remains to be judged — after all, it is a well-known problem in the industry that blue OLED sub-pixels are much harder to produce than red and green ones. The sub-pixel layout may therefore change, and with it the perceived resolution, fill factor, and screen-door effect could change as well, as happens with the popular PenTile arrangement. But eMagin's OLED work is clearly professional-grade, so I am optimistic.

The rest of the optical system is top notch.

A triple-lens system creates the relatively wide 80°×80° field of view — it looked a bit smaller than the DK2's, though I hope to get the chance to compare them side by side. Dan claims the field of view can be increased a bit further in future versions. The image had a medium amount of geometric distortion (the demo software apparently applied no pre-distortion correction), but I didn't notice any chromatic aberration — and even if there were some, it would be hard to detect with no blue channel.

The screens are always in focus, since both the physical interpupillary distance and the focal length can be adjusted for each eye. The current prototype has no sensors to report those physical settings to the software, which is badly needed! The focal length can also only be adjusted while the lens assembly is flipped up, which makes it awkward. Yes, the lens-and-screen assembly flips up, meaning you can raise it whenever you want to see the outside world.

But the biggest improvement is clearly the screens. Compared with the DK2 and GearVR, the difference in resolution is obvious. The Rift CV1 and Vive have a per-eye resolution of 1080×1200; assuming a 100°×100° field of view, that works out to roughly 11 pixels per degree (PPD), whereas this headset delivers about 25 PPD — more than twice the CV1.
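The pixels-per-degree comparison above is simple arithmetic; here is a small Python sketch (the resolution and field-of-view figures are the estimates quoted in the text, not measured values):

```python
def pixels_per_degree(h_pixels: int, h_fov_deg: float) -> float:
    """Approximate angular resolution as horizontal pixels / horizontal FOV.

    This is the simple linear estimate; real optics distribute pixels
    non-uniformly across the field, so treat the result as a ballpark figure.
    """
    return h_pixels / h_fov_deg

# Rift CV1 / Vive: 1080 horizontal pixels per eye over an assumed ~100 degrees
print(pixels_per_degree(1080, 100))   # ~10.8 PPD

# eMagin prototype: 2048 horizontal pixels over 80 degrees
print(pixels_per_degree(2048, 80))    # ~25.6 PPD
```

By this estimate the eMagin prototype resolves roughly 2.4× as many pixels per degree as the CV1, which is where the "more than twice" figure comes from.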


So I am very excited to imagine what the final version will be like: RGB screens, an 85Hz low-persistence mode, internal or external tracking sensors or LEDs, and improved industrial design (lower overall weight, better weight distribution, a proper head strap, etc.). On price there is no information yet, but Dan hinted that the original target was in the consumer range, and that once mass production starts, the final product will be very competitive. Put it this way: I would pay a lot of money to get this thing sooner — I already have several applications in mind for it.



castAR

I was also lucky enough to try a castAR. I had been quite curious about this device for a while, given the controversy surrounding it. The first thing to say: "castAR" is a misnomer; "castVR" would be more appropriate. To me, AR means seamlessly adding virtual objects to the real-world environment, whether through video processing or transparent lenses. castAR does neither: you have to place a retro-reflective panel in front of you, and the device obviously cannot show anything behind that panel. Virtual objects live in the space between the panel and the user. That makes it "fish tank" or "contained" VR, closer in function and form to a CAVE or other screen- and projection-based VR systems (zSpace, for example).


Of course, that is just a quibble over words.

So how does castAR (or castVR) actually perform?

Quite well, actually. Before trying it, I was skeptical about a few specifics: stereo quality, tracking quality, brightness and field of view, and interaction quality.

Before I go through those one by one, a brief description of how the thing works, since it is very different from HMDs like the Oculus Rift.

castAR is head-mounted, but instead of screens in front of your eyes, it uses two small projectors — one for the left eye, one for the right — that shine onto retro-reflective material placed in front of you, which bounces the light straight back to the wearer's eyes. Because the projectors sit very close to the eyes, castAR does not need to know the location of the retro-reflective material in advance to set up an appropriate projection matrix. Combined with six-degree-of-freedom head tracking, castAR forms very solid virtual objects in the space between the user and the retro-reflective material.

I was worried about stereo quality, having heard that there could be crosstalk between the two eyes. As it turns out, castAR produces almost perfect stereo.

The retro-reflective material mostly reflects the left projector back to the left eye and the right projector back to the right eye, but in a real physical environment, and given the material's properties, some leakage between the eyes is unavoidable. What I didn't expect is that castAR removes the residual crosstalk with polarizing filters (conveniently, the retro-reflective material is metallic, so it preserves polarization) — which is what produces the almost perfect stereo.

My second concern was tracking quality.

castAR uses optical tracking, based on two cameras on the headset (I don't know whether the two cameras form a stereo pair) and an active infrared LED array. Essentially, it is the DK2's optical tracking system in reverse. The tracking was OK, but not great. There was jitter (maybe a few millimeters to a centimeter), and latency was quite noticeable, causing virtual objects to drift instead of staying locked in place. Annoying as that is, it must be noted that it does not cause the dizziness it would in enclosed VR, since the virtual objects occupy only a small part of the user's view. When I asked, a castAR staffer said tracking is purely optical, with no inertial sensors or sensor fusion. Even so, I have seen better optical tracking, especially given that the demo tracked only a small volume and the infrared array was large (approximately 5″×5″).

The single input device, a wand, is tracked the same way, via the two front-facing cameras and an infrared array. The jitter is less noticeable on the input device, but the cameras' field of view becomes the main issue: the wand is only tracked reliably when pointed roughly toward the infrared array. That makes it hard to pick up virtual objects and flip them over — hands-on, they simply stopped responding. I have used plenty of tracked wands, and this was very uncomfortable. castAR's demos guide you to keep aiming roughly at the infrared array to avoid the problem, but for many common applications, compensating with inertial sensors and sensor fusion should improve tracking quality a great deal.
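The kind of optical/inertial fusion suggested above can be sketched, reduced to a single rotation axis, as a complementary filter: the (fast but drifting) gyro rate carries short-term motion, while the (slow but absolute) optical measurement corrects long-term drift. The gain and rates below are illustrative values of my own, not anything from castAR:

```python
def complementary_filter(angle_prev, gyro_rate, optical_angle, dt, alpha=0.98):
    """One step of a 1-D complementary filter.

    Integrates the gyro rate to predict the new angle, then pulls the
    estimate toward the absolute optical measurement. alpha close to 1
    trusts the gyro for short-term motion; (1 - alpha) corrects drift.
    """
    predicted = angle_prev + gyro_rate * dt
    return alpha * predicted + (1.0 - alpha) * optical_angle

# Example: the gyro reports 10 deg/s while the optical tracker insists
# the true angle is 0.5 deg; the estimate converges toward the optical value.
angle = 0.0
for _ in range(100):                     # 100 steps at 1 kHz
    angle = complementary_filter(angle, 10.0, 0.5, dt=0.001)
```

A real tracker would fuse full 3-DOF orientation (and position) with something like an extended Kalman filter, but the division of labor — inertial for responsiveness, optical for absolute reference — is the same.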

On the bright side, the projectors' brightness and contrast are quite good. I expected a dim image, but thanks to the retro-reflective material, almost none of the projected light is lost on its way back to the user's eyes, and the virtual objects looked genuinely bright even in a brightly lit showroom. It is not perfect, though: one issue is the projectors' limited throw angle. castAR's field of view is constrained by two factors: the size of the retro-reflective material (virtual objects are invisible anywhere off the material), and the maximum projection angle of the projectors. When I asked, I was told castAR's projection angle is 70°; in actual use, though, it did not bother me much.

Finally, on calibration: in castAR's demos, the virtual wand was off from the physical wand by a few feet.

This should not normally happen, especially since the wand and the head are tracked against the same infrared array. I wonder whether it is due to the projection: the projectors are not at the same position as the user's eyes, so projected objects far from the retro-reflective sheet will show a noticeable offset. In that case, the software would still need to know the plane equation of the retro-reflective sheet in order to set up proper projection parameters.

Generally speaking, castAR casts something of a spell compared to other screen-based VR systems. It has great stereo quality (better than LCD-based 3D TVs), good brightness and contrast, and the retro-reflective material is inexpensive and can cover as large an area as needed. You could build something like a CAVE by lining an entire room with retro-reflective material and viewing holographic imagery anywhere in the room, limited only by the 70° projection angle. Compared to other enclosing-class VR systems, including the CAVE, castAR's greater benefit is that it lets many people share and interact in the same space — an extremely important point for collaborative work. The retro-reflective material also eliminates the cross-contamination that multiple projection systems usually inflict on one another, and it is probably much cheaper than screen-based systems. If they can fix the tracking, I'd buy it in a heartbeat.


FOVE

I also tried the demo of the so-called "world's first eye-tracking virtual reality head-mounted display" (well, no — "first consumer-level eye-tracking VR display" would be more accurate). During the two-minute demo, I felt that gaze tracking was good enough in the center of the field of view, but very inaccurate at the edges. The demo requires a calibration step beforehand, in which you track small targets with your eyes. The demo proper was a very hectic shooting game — a poor vehicle for judging the screen's resolution and quality.


I noticed one strange effect: when I looked down at enemies below me, my aim went completely wrong. It may be a bug in the game's own gaze-ray calculation.

SMI eye tracking based on the DK2 / AltspaceVR

I also tried another "world's first" in eye-tracked virtual reality: SMI's eye tracking built into a Rift DK2. I had previously tried SMI's eye tracking on a Rift DK1. The new version looks a little better: you no longer need rectangular holes cut into the lenses, and the new tracking cameras sit behind the lenses without blocking anything. Functionally, however, it is no improvement over the previous edition. Even after a full calibration, there was a bias between where my eyes were aimed and the tracked result. In one of the demos, I could only select objects by centering them in my field of view with my head, which makes the eye tracking itself meaningless.

Of course, it could be because I wear contact lenses. SMI's tracker takes the user's cornea into account, and a contact lens shifts slightly on the cornea as I look around, which may cause the bias. Either way, this needs to be mentioned. I suspect that once eye-tracked VR goes on sale, contact lenses will become another recurring issue.

I would have liked to compare SMI's eye-tracking accuracy against FOVE's, but FOVE's demo made that hard to do (exactly how big were those objects' hit areas?). We'll only find out once someone gets a FOVE developer kit.

I would also like to briefly mention AltspaceVR's eye-tracking integration. It has two parts: navigation and interaction based on aiming at and centering on objects or locations, and a virtual eye animation that maps the user's eye movements and blinks onto AltspaceVR's cute little robot avatars. The latter part was actually quite underwhelming — blinks were only occasionally detected. This newly added eye-tracking support may just need refinement, but for now it is a nice extra rather than a core feature.
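Gaze-based selection of the kind described above typically reduces to testing the angle between the gaze direction and the direction to each candidate object, with a tolerance cone — which is also exactly why hit-area size matters when judging eye-tracker accuracy. A hypothetical sketch (the function, names, and threshold are mine, not AltspaceVR's):

```python
import math

def gaze_pick(gaze_dir, eye_pos, objects, max_angle_deg=5.0):
    """Return the name of the object closest to the gaze ray, or None.

    gaze_dir: gaze direction vector; eye_pos: eye position;
    objects: dict mapping name -> position (3-tuples).
    Objects outside the max_angle_deg cone are never selected.
    """
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def norm(a):
        n = math.sqrt(dot(a, a))
        return tuple(x / n for x in a)

    cos_max = math.cos(math.radians(max_angle_deg))
    best, best_cos = None, cos_max
    for name, pos in objects.items():
        c = dot(norm(sub(pos, eye_pos)), norm(gaze_dir))
        if c >= best_cos:               # inside the tolerance cone, and best so far
            best, best_cos = name, c
    return best
```

With a biased tracker, you either widen the cone (bigger hit areas, less precision) or the user compensates with head movement — which is precisely the failure mode seen in the SMI demo.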

Wearality Sky Lenses

This was my second time trying Wearality's 150°-field-of-view lenses, after the SVVR '15 Expo. This time I had the chance to scrutinize them. How should one actually measure a VR display's field of view? In general it is not easy, but Wearality Sky's open design makes measurement much simpler: with the viewer on my head, I could still see the real world beneath the lenses.


After putting it on, I adjusted my position until the left and right edges of the lenses lined up with the left and right edges of a table about three feet in front of me. The table was about six feet wide, which converts to a field of view of 90° — not 150°. When I asked the person running the demo about this, he said the 150° figure requires a phone with a larger screen. I didn't measure the phone in use, but it looked bigger than a Samsung Galaxy S5, around 5.5″.
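The conversion above is simple trigonometry: a surface of width w, centered on the viewer at distance d, subtends 2·atan(w/2 ÷ d). A quick sketch with the measurements from the text:

```python
import math

def fov_from_geometry(visible_width, distance):
    """Horizontal FOV (degrees) subtended by a flat surface of the given
    width, centered on a viewer at the given distance (same units)."""
    return math.degrees(2 * math.atan((visible_width / 2) / distance))

# The measurement described above: a ~6-foot-wide table seen from ~3 feet away
print(fov_from_geometry(6, 3))   # ≈ 90°, not 150°
```

Both estimates (three feet, six feet) are eyeballed, so the result is rough — but nowhere near 150°, which would require w/2 ≈ 3.7·d.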


Wearality's web site does not mention that the field of view depends on screen size, although they obviously must know it does; they really ought to provide a list of field-of-view figures per screen size. And the person running the demo was happy to keep telling everyone about the 150° field of view, even after telling me that number only holds for larger-screened phones. What was that about?

I later saw that Wearality's demo can switch between different fields of view by adding black borders of varying sizes to the rendered image. That lets them adapt the lenses to smaller phones, but it makes the claim that the lenses deliver a large field of view even less tenable.

