Today at Slush 2017, Finnish VR startup Varjo Technologies is announcing that their first formal prototype headset will be shipping within the next month, as well as unveiling their collaborations with a number of new development partners. While we don’t often cover these smaller companies, there are a few notable elements to Varjo’s endeavors. The first is the hardware itself: Varjo’s VR headset incorporates two displays per eye, an inner central region with very high resolution and pixel density, surrounded by a coarser, lower-resolution peripheral region. The second is the niche: Varjo is focusing solely on the professional market.

Update (11/30/2017): Article was updated to reflect the latest information available, including concrete details of the Alpha Prototype.

The headset – and the company itself for that matter – was outed this past June when Varjo exited their startup stealth period, and at that time they gave private demos to press and analysts with a modified Oculus Rift (CV1), calling the setup their "20|20" prototype. In that configuration, the Oculus display provided the 100 degree FOV periphery, while two 1920 x 1080 OLED microdisplays, taken from Sony DSC-RX100M4 compact cameras, provided the in-focus 20 degree region in 0.7 inches (diagonal) for a pixel density of around 3000 ppi.
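
That pixel density figure is easy to sanity-check with a little arithmetic: the diagonal pixel count of a 1920 x 1080 panel, divided by the 0.7 inch diagonal, works out to roughly 3100 ppi.

```python
import math

# Microdisplay specs from the "20|20" prototype:
# 1920 x 1080 pixels on a 0.7 inch (diagonal) panel.
h_px, v_px = 1920, 1080
diag_in = 0.7

diag_px = math.hypot(h_px, v_px)  # pixel count along the diagonal
ppi = diag_px / diag_in           # pixels per inch

print(round(ppi))  # → 3147, i.e. "around 3000 ppi"
```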

Scientifically, what Varjo is adopting is a form of foveated projection – mimicking human vision’s out-of-focus periphery and in-focus center. Foveation is widely considered to be one of the critical steps to the next generation of VR headsets: matching the human eye's own capabilities is, if executed correctly, a more direct route to high pixel density HMDs, without all of the manufacturing challenges of building a continuous high density display, and without the massive rendering workload required to fill one. Various GPU vendors have already been working on foveated rendering, and now with Varjo's prototype we're going to start seeing forms of foveated projection.
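
Varjo hasn't disclosed its rendering pipeline, but the general idea behind foveated rendering can be sketched as two passes: a cheap, coarse pass covering the full FOV, and an expensive, fine pass covering only a small inset around the gaze point, which is then composited over the coarse image. The `render` function below is a stand-in, not any vendor's actual API.

```python
# A minimal sketch of foveated compositing (NOT Varjo's actual pipeline):
# render the full FOV coarsely, render a small high-detail inset at the
# gaze point, and let the inset override the coarse image where it lands.

def render(width, height, detail):
    """Stand-in for a renderer: cost scales with pixel count."""
    return [[detail] * width for _ in range(height)]

def composite_foveated(fov_px, inset_px, gaze):
    gx, gy = gaze  # gaze point in full-frame pixel coordinates
    frame = render(fov_px, fov_px, detail="coarse")
    inset = render(inset_px, inset_px, detail="fine")
    # Top-left corner of the inset, clamped so it stays inside the frame
    x0 = min(max(gx - inset_px // 2, 0), fov_px - inset_px)
    y0 = min(max(gy - inset_px // 2, 0), fov_px - inset_px)
    for y in range(inset_px):
        for x in range(inset_px):
            frame[y0 + y][x0 + x] = inset[y][x]
    return frame

frame = composite_foveated(fov_px=100, inset_px=20, gaze=(50, 50))
print(frame[50][50], frame[0][0])  # prints "fine coarse"
```

Only the small inset pays the high per-pixel cost; everywhere else the renderer gets away with the coarse pass, which is the entire point of foveation.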

For their new Alpha Prototype HMD, the high density zone, called the "Varjo Bionic display," features 1920 x 1080 per eye at 8 bpp, with a 35 degree horizontal FOV and a refresh rate to be determined later. The outer periphery, or the "context display," runs at 1080 x 1200 per eye at 90Hz and 8 bpp, covering a 100 degree FOV. Varjo did mention that current prototypes are running at 60Hz, but that 120Hz rates are currently undergoing testing and are seen as a reachable goal. As for latency, Varjo stated that the displays are "typical row-updateable displays, with microsecond switch time," and so latency is generally low.

The two displays are complemented by an optical combiner and gaze tracker so that the focused region stays in line with the user’s gaze, though to be clear it is the 'projections' that move, not the displays. More concretely, the Alpha Prototype features integrated 100Hz stereo eye-tracking, as well as SteamVR tracking with controller support. As a whole, Varjo sometimes refers to these combined technologies as the “Bionic display” as well, and in the past has used that term metonymically for the headset itself. While not present on the Oculus-based prototype, their headset concepts and test units have external cameras to allow for mixed reality (MR) and augmented reality (AR) functionality via video see-through, as opposed to the optical see-through of HMDs like HoloLens. These elements continue to be absent from the Alpha Prototype, with Varjo looking to bring in mixed reality later next year.

As for miscellaneous support, the prototype does not have integrated speakers or a headphone jack, both of which Varjo wants to include in the future. Only projects built with Unreal or Unity will run on the Alpha Prototype, with Unreal 4.16 or later and Unity 2017.1.1f1 or later recommended. Varjo is additionally recommending two DisplayPort and two USB 3.0 connections, though it is not clear if the HMD can operate without all four. Otherwise, Varjo will supply all the software updates for the device.

Taking a step back, what Varjo is going for is essentially removing the screen-door effect of current VR HMDs by offering an area of very high pixel density. But the tradeoff of that pixel density is the limited size and FOV of the OLED microdisplays; hence foveated projection, with Varjo filling in the rest of the FOV at a lower resolution. Executed correctly, this means that the high resolution display is always aligned with the center of the user's FOV – where a user's eyes can actually resolve a high level of detail – while the coarser peripheral vision is served by a lower pixel density. As mentioned earlier, in principle this allows for the benefits of high resolution rendering without the drawbacks of a full-FOV high density display.
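
The savings can be roughly quantified from the Alpha Prototype's own numbers. Extending the Bionic display's density (1920 x 1080 over a 35 degree FOV) across the full 100 degree FOV would require roughly (100/35)², or about 8x, the pixels; the foveated split instead only adds the context display's 1080 x 1200. This is a back-of-the-envelope check that assumes pixel density scales linearly with FOV angle, which real HMD optics only approximate.

```python
# Back-of-the-envelope pixel budget per eye, using the Alpha Prototype's
# published numbers. Assumes density scales linearly with FOV angle,
# which is an approximation for real HMD optics.
bionic_px  = 1920 * 1080   # high-density "Bionic" display, 35 deg FOV
context_px = 1080 * 1200   # lower-density "context" display, 100 deg FOV

# Hypothetical single display: Bionic density across the full 100 deg FOV
full_density_px = bionic_px * (100 / 35) ** 2

foveated_px = bionic_px + context_px
print(round(full_density_px / foveated_px, 1))  # → 5.0, ~5x fewer pixels
```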

However, this is not to say that Varjo's approach is without its own challenges. Such a solution only works if A) the displays are visually seamless, B) the displays have synchronized high refresh rates, and C) eye-tracking ensures that the focused high density region is always where the user’s gaze lands, all of which Varjo is aiming for. These become especially important once video see-through based AR/MR is in the picture. The retrofitted Oculus highlighted some of these concerns, as some press observed jitter due to the disjointed refresh rates; for the Alpha Prototype, the context display still features the Oculus/Vive specifications of dual 1080 x 1200 at 90Hz. It also remains to be seen how well Varjo's eye tracking can keep up with rapid eye and head movement.

The other side benefit is that using off-the-shelf-ish Sony microdisplays significantly reduces costs and man-hours for rapid prototyping, as demonstrated by the retrofitted Oculus. So in the full context, the idea of mimicking human vision is deeply intertwined with the design and engineering choices, marketing appeal of human-eye resolution notwithstanding.

Another thing to note is that the Sony microdisplays are around $850 apiece, which by itself is more than many consumer HMDs already on the market. The Varjo headset is intended for the professional market and will be priced that way; Varjo described a typical user as using dual Quadro P6000s, which is not only an immense amount of graphical horsepower but also around $10,000 for the graphics cards alone. Coincidentally, Varjo states that their first professional HMDs will be priced under $10,000. For PCs intending to power the Alpha Prototype, Varjo is recommending AMD FX 9590 or Intel Core i7-6700-level CPU performance or better, AMD Radeon RX Vega or NVIDIA GeForce GTX 1080-level GPU performance or better, and at least 16 GB of DDR4 RAM.


Varjo headset prototype at GTC 2017 Europe, featuring NVIDIA CEO Jen-Hsun Huang and Varjo CEO Urho Konttori (Varjo Instagram)

In practice, positioning for the professional market requires a lot of collaboration in regards to workstation applications, certification, drivers, and the simple calculus involved in fitting a specialty technology into a workflow. Which is why the second part of today’s announcement mentions a number of development partners; Varjo lists 20th Century Fox, Airbus, Audi, BMW, Technicolor, and Volkswagen as headliners. And on the GPU side, Varjo is involved with AMD and NVIDIA, though no further details were given. But even at a glance there are interesting avenues to pursue, such as LiquidVR on AMD’s side, and Holodeck or VRWorks on NVIDIA’s.

To be clear, details in general were limited, but this is to be expected. After all, formal prototypes are just now finding their way into development partners’ hands: the Alpha Prototype headset, shipping with Unreal and Unity plugins (or a C++ API if neither is applicable), is slated for select partners before 2018, while Beta Prototypes are specified to begin shipping in Q1 2018 to partners involved in design, engineering, simulation, and entertainment. And Varjo is still very much a developing company by venture capital standards, having received a round of Series A funding just two months ago.

It’s worth noting that Varjo positions itself as a VR hardware and software company, and is now describing its headsets more as a vehicle for the Bionic display. For the time being, their currently open positions reflect a continued focus on software, and Varjo plans to publish Bionic display API information and developer resources in early 2018, around the time when Beta Prototypes are planned to ship. Now that the “20|20” headsets have successfully courted a number of interested companies, Varjo can look more broadly at both productizing a headset and planning an ecosystem in which it can succeed.

Varjo expects to launch its first commercial headsets in 2018, with a public roadmap targeting Q4 2018 for launching "professional products." Interested parties may apply for early access via the form at the bottom of Varjo's website; of note, headsets are only loaned for the duration of the early access testing period.

Source: Varjo

19 Comments

  • Diji1 - Friday, December 1, 2017 - link

    I have read people's impression of it at trade shows and apparently it works very well, the image is super clear and very stable.
  • Nate Oh - Thursday, November 30, 2017 - link

    I believe this was definitely how the retrofitted Oculus prototype worked, but as for the Alpha Prototype it's a little unclear - and purposely so, like extide mentioned. According to Varjo's developer FAQ, anyone that has a Varjo headset as part of their early access has to run by them first before showing or demoing to others. So I don't expect us to find out anytime soon.
  • edzieba - Friday, December 1, 2017 - link

    The retrofitted Oculus prototype lacked any movement whatsoever; the microdisplay was fixed to the centre of the view.
  • Nate Oh - Friday, December 1, 2017 - link

    I wasn't trying to say otherwise. I was actually responding to boeush.
  • edzieba - Thursday, November 30, 2017 - link

    From what I can tell, these initial HMDs lack whatever 'secret sauce' they intend to use to physically shift a microdisplay faster than an average galvanometer. The central panel is fixed relative to the head, like with the modified CV1s, so you need to point your head at a target and keep your eyes centred in the middle of the view to gain any benefit. The version with the mobile microdisplay will come later.
  • leonlee - Thursday, November 30, 2017 - link

    "As for latency, Varjo notes that the displays have microsecond switch times and thus generally have low latency."

    Did you mean sub-millisecond switching times?
  • Nate Oh - Thursday, November 30, 2017 - link

    "Microsecond switch times" is actually what they told me verbatim, so I didn't want to twist their words per se. I've reworked the sentence to include a quote in order to make that a little clearer.
  • theuglyman0war - Thursday, January 18, 2018 - link

    sophisticated weaponized Eye tracking for targeting systems on military crafts ( like the Huey Cobra's targeting for it's Gatling gun ) has been an advanced technology concentration as far back as the late 80's?
    On the other hand...
    Why would that actual effect require a mechanical result to achieve the focus/blur? Between what that foveated tracking tech returns as the point of interest...
    and in combination with an HMD's depth camera/inside-out tracking of the actual depth at that position?

    Why can't those coordinates be enough to simply calculate a realistic render of focus/blur depending on depth returned at any given point of viewer interest?

    Does this optical combiner actually need to do anything other than composite an overlay where ( if AR ) everything CG is simply as blurry as everything else not in focus given every objects supposed/given depth?
    And if VR...
    What optic wizardry would even be needed for the displayed content? Wouldn't the render simply take into account returned depth tracked to calculate depth for stereoscopic separation and convergence and then blur and focus depending on interest tracking? ( interest tracking being the actual/only foveat tech needed? )

    My brain hurtz... So I assume I am missing something not obvious to me and beyond my meager comprehension. ( I can smell my burning brainz bits )
  • theuglyman0war - Thursday, January 18, 2018 - link

    damn... didn't realize the subject was so necro... :(
