Integrating Leap Motion and AutoCAD: Introduction

I've been alluding somewhat vaguely (and sometimes more specifically) to an interesting integration project I've been working on recently. The project is to integrate the Leap Motion controller into AutoCAD, to explore the possibilities of this device.

For those of you who haven't yet heard about the Leap Motion controller, this could well be the hottest tech gadget of 2013. At last week's CES, the BBC interviewed Avinash Dabir from Leap Motion to understand the potential for the device and a little about how the technology works. And here's a very quick demo video:

So what could this mean for CAD? As stated in the "Our History" section of the Leap Motion Story page:

"The original inspiration behind Leap Motion came from our frustration with 3D modeling."

Which I'm sure you'll agree is likely to make this technology all the more relevant and interesting for our industry. 🙂

This series of blog posts is focused on integrating Leap Motion into AutoCAD. In the next post we'll define some basic gestures for model navigation in both 2D and 3D, before moving on to approaches for actual modeling – whether via a generic or a more specific integration – looking at the pros and cons of each.

Firstly, though, a quick comment on driving software via hand gestures. Regular readers will hopefully recognise that I'm all for exploring new UI paradigms – I've had a lot of fun fooling around with Kinect and AutoCAD, for instance – but I have to admit that I'm far from being sold on the idea of using gestures as a sole input mechanism. It might be compelling for casual input – perhaps to a large, shared display or in due course to a mobile device – but I've personally found it very tiring to use hand gestures for any length of time.

I do think, though, that having a device such as the Leap Motion controller available while working with more traditional UI devices (a mouse and keyboard, for instance) could be interesting: you might wave your hand to perform a quick view change, for instance, while you're drafting or modeling with your existing set-up.
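To make that concrete, here's a minimal sketch – in Python, using an invented frame format rather than the actual Leap SDK types – of how a deliberate horizontal swipe might be distinguished from ordinary hand jitter before being mapped to a view command:

```python
# Hypothetical sketch: detecting a quick horizontal hand swipe from a
# short stream of palm x-positions (in mm), so it could trigger a view
# change (an orbit or pan, say) alongside normal mouse/keyboard input.
# The frame-data shape here is an assumption, not the real Leap SDK API.

def detect_swipe(xs, threshold=120.0):
    """Return 'right', 'left', or None based on net palm displacement.

    xs: palm x-positions (mm) sampled over a short window (~0.25 s).
    threshold: minimum net travel (mm) to count as a deliberate swipe.
    """
    if len(xs) < 2:
        return None
    delta = xs[-1] - xs[0]
    if delta > threshold:
        return "right"
    if delta < -threshold:
        return "left"
    return None

# Synthetic palm trajectories, standing in for real device samples
print(detect_swipe([-60, -20, 30, 80]))   # net +140 mm -> 'right'
print(detect_swipe([50, 10, -40, -90]))   # net -140 mm -> 'left'
print(detect_swipe([0, 5, -3, 4]))        # small jitter -> None
```

A detector like this would sit between the device's frame stream and whatever view command you want to fire, with the threshold tuned so that resting or repositioning your hand doesn't trigger anything.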

That said, who knows where this style of interaction will lead, longer-term: I suspect our generation of knowledge workers is likely to be pitied for having had to suffer for decades from using technology that was not designed with ergonomics as a priority. It could well be that this type of device ultimately helps set us free.

That's it for the quick preamble… in the next post we'll take a first look at integrating Leap Motion into AutoCAD, implementing some basic model navigation gestures.

10 responses to “Integrating Leap Motion and AutoCAD: Introduction”

  1. Sounds interesting! Looking forward to reading the rest of the series. I can certainly imagine having the Leap act as an auxiliary input method, particularly for orbit, drag and zoom actions, but of course how it actually works in practice may be another matter.

  2. Wow, man – I would like to learn how to do this stuff. It's just amazing; you're doing a great job, bro.

  3. Good post – I had not seen that device before.
    Seems to me this complicates the idea of placing the program and data on a remote server (referred to as "the cloud" in consumer lingo), as now you have this point-cloud data from your hands that must be sent to that remote server. Given the trouble I have seen so far with jittery mouse cursors, and the hoops people have gone through to solve that, sending point clouds in real time seems 10x the problem. Any thoughts on that? I don't think it's a real issue, as companies will not turn over control of their software to a vendor – I'm just curious what people at Autodesk think when they see this kind of device surface.

  4. The local processing (the Leap Motion runtime) generates a fairly small set of data that apps use: they get hands with fingers, for instance, rather than having to deal directly with low-level point clouds. Whether or not the added latency of working with a cloud-based service gets in the way remains to be seen (lots of "cloud-based" services still maintain client-side components for UI implementation, anyway).

    Kean
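
    To give a rough feel for how small that per-frame payload is, here's an illustrative sketch – the class names and fields are invented for the example, not the real Leap SDK types – of the hands-with-fingers data an app receives, versus a raw point cloud:

    ```python
    # Illustrative sketch of the kind of per-frame data the Leap runtime
    # hands to applications (hands with fingertips, not raw point clouds).
    # These class names and fields are assumptions, not the real SDK types.
    from dataclasses import dataclass

    @dataclass
    class Finger:
        tip: tuple          # (x, y, z) fingertip position in mm

    @dataclass
    class Hand:
        palm: tuple         # (x, y, z) palm centre in mm
        fingers: list       # up to five Finger instances

    def frame_float_count(hands):
        """Floats needed to describe one frame of tracked hands."""
        return sum(3 + 3 * len(h.fingers) for h in hands)

    hand = Hand(palm=(0.0, 180.0, 20.0),
                fingers=[Finger(tip=(i * 20.0, 200.0, 0.0)) for i in range(5)])
    print(frame_float_count([hand]))  # 18 floats per hand per frame
    ```

    A few dozen floats per frame is trivial to stream compared with the tens of thousands of points in a raw depth capture, which is why the latency question above is about round-trip time rather than bandwidth.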

  5. Constantin Gherasim

    Hi Kean,

    I totally agree with you that any use of hand gestures for any length of time is going to be more or less tiring, therefore discouraging its use.

    And I can certainly see some reasonable uses for the device, as you say: for large, shared displays (by the way, I see it as a threatening alternative to the overly expensive SMART Board, at least in a future incarnation able to cover a greater volume), for weather presenters, or as a complementary CAD device, replacing costly 3D input devices like the SpacePilot Pro and similar.

    But right now I have another question for you.

    Assuming you already own this device, could you please say whether it has some basic functionality out of the box – something like a mouse or a trackpad – or whether one really has to start using the SDK to make any use of it?

    I was trying to find the answer on their forum, but everything concerning this issue seems very confusing at the moment.

    TIA,

    Constantin

  6. Hi Constantin,

    It's really a developer-focused tool, at this stage. There's the Leap Visualizer – a standalone app that comes with the SDK – but that really just allows you to analyze the data coming from the device.

    I fully expect a basic OS integration to be provided either along with the Leap Motion controller or in their App Store, once live. To give you a feel, the third post in this series does a very basic Windows-message-level integration that should work with pretty much any Windows app.

    But, frankly speaking, if the Leap Motion is just used as a mouse replacement - without a higher-level, gesture-based integration - then it will fail: it's far too tiring to use in this way. A mouse would be much better, as you can at least rest your hand on it (rather than maintaining some distance from the device, hovering over it).

    I hope this helps,

    Kean

  7. Constantin Gherasim

    Thanks again for the quick reply, Kean.

    Obviously, if this device were used only as a mouse replacement, it would be downright ridiculous and useless 🙂

    My question was purely rhetorical, of mere curiosity 🙂

    Regards,

    Constantin

  8. As I recall from the early press on the Leap Motion device, it was envisioned as a way to provide a natural interface for drawing and sculpting. If developers are going to map its abilities onto existing crippled interfaces, I don't think it will achieve its true potential. Assemble a system from an existing library of iMapped parts in Autodesk Inventor using the existing UI. Now imagine that you can pick up those same parts in an environment that supports surface and fit constraints to objects and assemble them as if they were real – with the added advantages of scaling, zooming, and relaxing surface constraints to put something inside that you forgot. This is where the Leap Motion interface will shine, but it requires re-engineering Inventor to get those benefits. If Autodesk won't do it, somebody will.

  9. Kean Walmsley

    We'll see how this plays out. Other work is going on in this area, as you might expect. The main target isn't AutoCAD - this was a simple prototype to kick the tyres, despite its "crippled interface" (sigh).

    I love the whole Minority Report concept, too, but my personal sense is that it's going to be for casual use only until the ergonomics are dealt with (probably via some kind of haptic feedback).

    Kean

    P.S. No need for all the drama, by the way. It gets tedious.

  10. Oops – iMated, not iMapped. Shouldn't write late at night.
