Reducing the pain of VR content generation: Autodesk LIVE

Some of you may have heard about the launch last week of a new Autodesk product focused on generating interactive 3D environments from Revit models. It's called Autodesk LIVE and sits under the Autodesk LIVE Design umbrella (at least I think I have that the right way around).

LIVE Design

Autodesk LIVE Design is all about making it easier to apply best-in-class visualization technology from our Media & Entertainment division – who collectively know a thing or two about high-end rendering and animation – to the world of design and engineering.

Here's a video describing how Autodesk LIVE – formerly known as Project Expo, for those of you who know it – works:

The "one click" publishing workflow from Revit sends the required content to the cloud for processing and downloads the generated 3D scene for local exploration. The local environment is based on Autodesk Stingray – our game engine tailored for design visualization – and gives you all the tools you need to explore this high-quality 3D environment on your PC (or take it on the road with your Windows or iPad tablet).

This is all very cool, but what's ultimately most interesting to me is what this technology brings (or will bring) to the AR/VR table. Today, people who want to take CAD data into an AR or VR environment – often based on Unity or Unreal, both of which are really tailored for game content rather than design visualization – have to spend an inordinate amount of time making the content work in those environments. And by "inordinate" we're talking orders of magnitude more than you'd expect: I've heard customers talk about hundreds of hours to get a basic model ready for VR or other realtime 3D use.

Now I'm not saying Autodesk LIVE Design is going to solve all these problems overnight (or even in the immediate future), but the promise is certainly there: at some point this technology really is going to reduce the pain of going from CAD to AR and VR.

I'm excited!

3 responses to “Reducing the pain of VR content generation: Autodesk LIVE”

  1. Whoa! You read my mind here. I forget if I said it, but I keep thinking Adesk needs to make a gaming engine for VR viewing. I use InfraWorks and Navisworks, which are gaming engines IMO, but they do not handle the caching of landscaping well enough. Then you go to Unity, and their built-in surfaces are grid-based. The civil world does TINs, and we have custom tools that draw the 3D breaklines from roads and slopes so we can make the best TINs for the budget. Like you say, though, getting that into Unity and draping images like IW does is not something we have totally figured out. Those that do compromise too much for roads, IMO. It's crazy, but if you get VR going for IW or another gaming engine that plays well with TIN surfaces, you will so fully eclipse the efforts of those trying to make IW into a design program that I wonder if that part of IW would survive. The teams must understand what the industry cares about. I can say the VR end is a Grand Canyon-sized gap compared to any design-tool-related thing. Great post.

  2. btw, what happens if you copy that Revit house 200 times in a grid, like a subdivision, and add a grove of trees? I'm not saying Unity can handle unlimited 3D models either – it can't – but it does handle vegetation and water super efficiently. It's like you can do unlimited trees and it just shows what it needs to at the moment. So while the events and things of Unity are extra to people doing project visualizations, the vegetation seems to be the thing Adesk programs do not handle well enough.

    1. Not sure – I haven't played with this myself, as yet. It's exactly the kind of thing game engines do well, though, so I'd hope Stingray would, too.

      Kean
