AR development

Discussion in 'Instruments / Avionics / Electrical System' started by Hephaestus, Jun 3, 2019.


  1. Jun 3, 2019 #1

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM
    Anyone here a real programmer?

    After playing with a DJI drone with the Epson AR glasses last week, I really am starting to wonder about the next step for GA...

    I'm more of a copy-and-paste-Arduino-sketches-to-make-a-program-work type... so I'm asking about something that's way beyond my skill set this week.

    Anyone have an idea what it would take to make use of something like the Epson glasses, and start adding in bits like a HUD?

    Gonna start reading up on ARCore programming and see if I can wrap what's left of my mind around it, lol.
     
  2. Jun 3, 2019 #2

    pwood66889

    Well-Known Member

    Joined:
    Feb 10, 2007
    Messages:
    1,397
    Likes Received:
    134
    Location:
    Sopchoppy, Florida, USA
    Well, Heph, lessee here. Got my BS certified as the computer science option of a math degree. Programmed in COBOL, FORTRAN, and several flavors of assembly language, some proprietary stuff (anyone heard of PowerHouse by Cognos?), and BASIC of course. Last go-round was Java on Linux/Unix.

    So you're looking at sending drone signals to Augmented Reality glasses? Like you can turn your head and the glasses adjust the image as if you were in the cockpit? I think I'd need a bit better idea of what you're trying to accomplish before I start asking about inputs and outputs.
     
  3. Jun 3, 2019 #3

    FritzW

    Well-Known Member

    Joined:
    Jan 31, 2011
    Messages:
    3,541
    Likes Received:
    3,156
    Location:
    Las Cruces, NM
    I've played with writing a few VR apps for Android using Unity VR. It's probably not that different from writing an AR app (...except the "augmented" part all happens through the camera). Do the Epson glasses have head tracking?

    Just find an Epson SDK that has good YouTube tutorials ;)
     
  4. Jun 3, 2019 #4

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM
    Actually, I want homebuilt gauges, and my Mooney's G500 panel, interpreted, then basic AoA and speed/slope info overlaid. Really, for most it'd be an onboard black box running its own set of sensors and displaying them on your headset, so you're less prone to getting lost in the panel.

    Adding ADS-B will turn into the killer app, I'm sure: traffic highlighting within a mile, using the data to highlight the ballpark area to expect it. Using the camera to pinpoint it would be super cool, but I expect that would be a processing nightmare.

    These exist for drones already, not so much for GA.

    The Epsons do have a gyro; on the DJI it's tied to the gimbal for the camera. Took about 10 minutes to get used to the overlays, even over my glasses. It was pretty much perfect.
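
    For the "within a mile" part, the math is pretty tame. Something like this flat-earth approximation would be enough to decide when to paint a highlight and roughly where to look for the target; the positions and the 1 nm threshold below are made up for illustration:

    [code]
    // Flat-earth approximation for nearby ADS-B traffic; fine within a few miles.
    #include <cmath>
    #include <cstdio>

    const double kPi = 3.14159265358979323846;

    struct Fix { double latDeg, lonDeg, altFt; };   // own-ship or traffic position

    // Range (nm) and true bearing (deg) from 'own' to 'tgt'.
    void relativeTraffic(const Fix& own, const Fix& tgt,
                         double& rangeNm, double& bearingDeg) {
        const double nmPerDegLat = 60.0;                      // ~60 nm per degree of latitude
        double north = (tgt.latDeg - own.latDeg) * nmPerDegLat;
        double east  = (tgt.lonDeg - own.lonDeg) * nmPerDegLat
                       * std::cos(own.latDeg * kPi / 180.0);  // longitude shrinks with latitude
        rangeNm    = std::sqrt(north * north + east * east);
        bearingDeg = std::fmod(std::atan2(east, north) * 180.0 / kPi + 360.0, 360.0);
    }

    int main() {
        Fix own = {56.65, -111.22, 3500.0};   // made-up own-ship position
        Fix tgt = {56.66, -111.20, 4000.0};   // made-up traffic position
        double rangeNm = 0.0, bearingDeg = 0.0;
        relativeTraffic(own, tgt, rangeNm, bearingDeg);
        if (rangeNm < 1.0)   // "within a mile": close enough to paint a highlight
            std::printf("traffic %.2f nm away, bearing %.0f true\n", rangeNm, bearingDeg);
        return 0;
    }
    [/code]

    Pinpointing the target in the camera view is a whole other problem; this only gives range and bearing to hang a ballpark cue on.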
     
  5. Jun 3, 2019 #5

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM
    *Shudder* yes... Actually, yes I have... Back in the days of working at Systemhouse, crawling through crawlspaces in a suit and tie, because that's what IT schmucks got to do and that's how they had to dress. Lol.

    At least the server rooms were clean. But...
     
  6. Jun 3, 2019 #6

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM


    That's where Aero Glass was headed. Some stuff I like... some I hate.

    I was thinking more like a military HUD:

    [Image: military-style HUD, "stolen" from Flying Magazine]

    Maybe you guys in busier and more zoned airspace would want the crazy zone walls etc.

    I think the military has invested enough to make it clear what's needed. To me, I'd be more interested in attitude, airspeed and direction; routing would be kinda cool too. That 3D ADS-B output would be awesome, as long as it doesn't overwhelm the screen like the paragliding parts of Aero Glass's video.

    Do you really need boxes to fly through as you're taking off? Probably not, but the runway/taxiway stuff could be cool.
     
  7. Jun 3, 2019 #7

    FritzW

    Well-Known Member

    Joined:
    Jan 31, 2011
    Messages:
    3,541
    Likes Received:
    3,156
    Location:
    Las Cruces, NM
    The Auriga app has a very clean HUD. After a while you don't even see it anymore; it just sneaks the data into your brain ;) You could (almost) just velcro a $35 flight controller to the airplane and run any of the Play Store apps.

    Some more info here
     
  8. Jun 3, 2019 #8

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM
    That looks more correct.

    Now how do you take the drone out of the equation?

    I liked the Epsons because they're see-through and meant to go over glasses. So you don't really want the video feed, just the data overlay...
     
  9. Jun 3, 2019 #9

    FritzW

    Well-Known Member

    Joined:
    Jan 31, 2011
    Messages:
    3,541
    Likes Received:
    3,156
    Location:
    Las Cruces, NM
    For prototyping/proof of concept it could be as simple as velcroing a drone flight controller to the airplane. You'd have 9 DoF and an OSD. If you didn't have it connected to a camera (or video receiver), the background would be null, i.e. not black, just empty.

    A much better, and probably just as easy, way: start poking around on SparkFun; there's a video of doing it (at least collecting the data) with a Teensy and a 9 DoF sensor.

    This may not be the best hardware for the job; it was just the first stuff to show up in a Google search.

    [Images: Teensy, 9 DoF sensor]
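
    For the "collecting the data" part, an Arduino-style sketch along these lines would get numbers streaming out of a Teensy. This assumes the SparkFun LSM9DS1 library and their 9 DoF breakout; swap in whatever IMU and library you actually end up with:

    [code]
    // Minimal data-collection sketch for a Teensy + SparkFun 9 DoF breakout.
    // Assumes the SparkFun LSM9DS1 Arduino library (I2C wiring, default addresses).
    #include <Wire.h>
    #include <SparkFunLSM9DS1.h>

    LSM9DS1 imu;

    void setup() {
      Serial.begin(115200);
      Wire.begin();
      if (imu.begin() == false) {
        Serial.println("IMU not found - check wiring");
        while (1) {}
      }
    }

    void loop() {
      if (imu.gyroAvailable())  imu.readGyro();
      if (imu.accelAvailable()) imu.readAccel();
      if (imu.magAvailable())   imu.readMag();

      // Stream calibrated values out the USB serial port as CSV:
      // gyro (deg/s), accel (g), mag (gauss)
      Serial.print(imu.calcGyro(imu.gx));  Serial.print(',');
      Serial.print(imu.calcGyro(imu.gy));  Serial.print(',');
      Serial.print(imu.calcGyro(imu.gz));  Serial.print(',');
      Serial.print(imu.calcAccel(imu.ax)); Serial.print(',');
      Serial.print(imu.calcAccel(imu.ay)); Serial.print(',');
      Serial.print(imu.calcAccel(imu.az)); Serial.print(',');
      Serial.print(imu.calcMag(imu.mx));   Serial.print(',');
      Serial.print(imu.calcMag(imu.my));   Serial.print(',');
      Serial.println(imu.calcMag(imu.mz));
      delay(20);   // ~50 Hz is plenty for a HUD prototype
    }
    [/code]

    From there it's "just" fusing those readings into an attitude and getting it onto the glasses.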
     
  10. Jun 3, 2019 #10

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM
    The Epson glasses (the 300s; I'm guessing the 350s are $600 more) already have those sensors built into the glasses plus the base.

    Getting the data isn't hard; it's processing it into an overlay video that gets ugly ;)
     
  11. Jun 4, 2019 #11

    Vigilant1

    Well-Known Member HBA Supporter

    Joined:
    Jan 24, 2011
    Messages:
    3,898
    Likes Received:
    1,717
    Location:
    US
    If a plane is in Class A airspace or otherwise in a place where all traffic is under positive control, then a HUD with all kinds of garbage on it is fine. Similarly, if a fighter is below FL180 but running his AI radar so his HUD will actually highlight my location so he can avoid me as I drone along, then that is great. Otherwise, I'd prefer that my little dot of a plane remain plainly visible to others and not hidden behind the fuel level depiction or other distractions. In VMC, all of us are supposed to be seeing and avoiding. ADS-B is not a requirement in most of the airspace below Class A. There's plenty of time for video games when at home. Thank you.
     
  12. Jun 4, 2019 #12

    pwood66889

    Well-Known Member

    Joined:
    Feb 10, 2007
    Messages:
    1,397
    Likes Received:
    134
    Location:
    Sopchoppy, Florida, USA
    K. Won't have to relearn QTP :)
    Now to get onboard with the Epson AR glasses - what do they take as inputs?
     
  13. Jun 4, 2019 #13

    Hephaestus

    Well-Known Member

    Joined:
    Jun 25, 2014
    Messages:
    1,148
    Likes Received:
    246
    Location:
    YMM
    They're basically a full-featured Android device on their own.

    https://epson.ca/For-Home/Smart-Gla...-Glasses-(AR-Developer-Edition)-/p/V11H756020

    My hunch would be to use a separate microcontroller to poll the CAN bus or other sensors for data. I'd think you'd want that device building the video and sending it to the glasses as a stream, because you wouldn't want the glasses getting hung up or slowed down and lagging...
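
    Rough sketch of that idea, assuming an ESP32-class board does the polling and just broadcasts the numbers over WiFi UDP so the glasses only have to draw. The packet layout, network names, and sensor-read stubs are all made up for illustration:

    [code]
    // Sensor box: poll flight data and broadcast it over WiFi UDP to the display.
    #include <WiFi.h>
    #include <WiFiUdp.h>

    struct __attribute__((packed)) HudPacket {
      uint32_t millisStamp;   // time tag so the display can flag stale data
      float    airspeedKt;
      float    altitudeFt;
      float    headingDeg;
      float    aoaDeg;
    };

    WiFiUDP udp;
    const char*     kSsid = "hud-net";          // assumed network
    const char*     kPass = "change-me";
    const IPAddress kDest(192, 168, 4, 255);    // broadcast address for the glasses
    const uint16_t  kPort = 4000;

    // Stand-ins for the real CAN-bus / analog sensor polling.
    float readAirspeedKt() { return 120.0f; }
    float readAltitudeFt() { return 3500.0f; }
    float readHeadingDeg() { return 270.0f; }
    float readAoaDeg()     { return 4.2f; }

    void setup() {
      WiFi.begin(kSsid, kPass);
      while (WiFi.status() != WL_CONNECTED) delay(100);
      udp.begin(kPort);
    }

    void loop() {
      HudPacket pkt;
      pkt.millisStamp = millis();
      pkt.airspeedKt  = readAirspeedKt();
      pkt.altitudeFt  = readAltitudeFt();
      pkt.headingDeg  = readHeadingDeg();
      pkt.aoaDeg      = readAoaDeg();

      udp.beginPacket(kDest, kPort);
      udp.write(reinterpret_cast<const uint8_t*>(&pkt), sizeof(pkt));
      udp.endPacket();
      delay(50);   // ~20 Hz is plenty for a prototype
    }
    [/code]

    Sending only the numbers (instead of a rendered video stream) keeps the link light and lets the glasses pace their own drawing without lagging.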
     
  14. Jun 5, 2019 #14

    pwood66889

    Well-Known Member

    Joined:
    Feb 10, 2007
    Messages:
    1,397
    Likes Received:
    134
    Location:
    Sopchoppy, Florida, USA
    Epson says "Enhanced connectivity — dual-band wireless connectivity and Bluetooth Smart (BLE) enable integration with a wide range of accessories."
    Also says "Operating Systems: Android 5.1," so I guess you just need to build onto the BLE and go from there.
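
    If you went the BLE route, the sensor box could advertise a custom service and notify the glasses whenever there's fresh data. A minimal sketch using the ESP32 Arduino BLE classes (the UUIDs and the single-float payload are just placeholders):

    [code]
    // BLE peripheral: expose a custom characteristic and notify with new data.
    #include <BLEDevice.h>
    #include <BLEServer.h>
    #include <BLEUtils.h>
    #include <BLE2902.h>

    static const char* kServiceUuid = "5f47a3c0-0000-4000-8000-000000000001";  // made-up UUIDs
    static const char* kCharUuid    = "5f47a3c0-0000-4000-8000-000000000002";

    BLECharacteristic* gDataChar = nullptr;

    void setup() {
      BLEDevice::init("HUD-Sensor-Box");
      BLEServer*  server  = BLEDevice::createServer();
      BLEService* service = server->createService(kServiceUuid);

      gDataChar = service->createCharacteristic(
          kCharUuid,
          BLECharacteristic::PROPERTY_READ | BLECharacteristic::PROPERTY_NOTIFY);
      gDataChar->addDescriptor(new BLE2902());   // lets clients enable notifications

      service->start();
      server->getAdvertising()->start();
    }

    void loop() {
      float airspeedKt = 120.0f;   // stand-in; replace with a real sensor read
      gDataChar->setValue(reinterpret_cast<uint8_t*>(&airspeedKt), sizeof(airspeedKt));
      gDataChar->notify();
      delay(100);   // ~10 Hz notifications
    }
    [/code]

    BLE keeps the wiring and pairing simple, but for anything beyond a handful of values the WiFi route is probably the easier path.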
     
  15. Jun 5, 2019 #15

    rdj

    Well-Known Member

    Joined:
    Sep 10, 2009
    Messages:
    288
    Likes Received:
    185
    Location:
    Northern California
    Possibly, either via WiFi or the USB connector on the controller. Note that the glasses are connected to the controller, and the controller is where most of the processing occurs. The Intel Atom in the controller may or may not have sufficient power.

    The tricky part is that the glasses are also sending head-tracking data back to the controller. Simple 2-D graphics such as airspeed etc. don't care about that, and those displays can be done simply by writing to the overlay plane in the video subsection of the Atom processor on the controller (I'm presuming it has one; most modern micros do).

    However, 3-D graphics like real-time ADS-B traffic etc. require some knowledge of 3-D mathematics and the Android OpenGL graphics stack (or maybe Vulkan for Android 7 or higher) if done on the controller, or similar if the head-tracking data is sent all the way back to your separate microcontroller. You'll need to be able to perform the 3-D transformations on both the sensor data (ADS-B traffic position) and the current head-tracking data from the glasses to accurately position the traffic in the field of view (quaternions are your mathematical friend here), and some further manipulation to generate the appropriate stereo separation (for each eye) is also required.

    Not an easy project for a beginning programmer.
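
    To make the quaternion part concrete, here's a math-only sketch (no OpenGL/Vulkan, no stereo separation) of taking a target's position relative to the aircraft, rotating it by the inverse of the head-pose quaternion, and projecting it to a spot on the display. The frame conventions and field of view are assumptions, not the Epson's actual numbers:

    [code]
    // Rotate a target from the world frame into "head space" and project it.
    #include <cmath>
    #include <cstdio>

    const double kPi = 3.14159265358979323846;

    struct Vec3 { double x, y, z; };
    struct Quat { double w, x, y, z; };   // unit quaternion: head pose (head frame -> world frame)

    Quat conjugate(const Quat& q) { return {q.w, -q.x, -q.y, -q.z}; }

    // Rotate vector v by unit quaternion q (computes q * v * q^-1).
    Vec3 rotate(const Quat& q, const Vec3& v) {
      // t = 2 * (q.xyz x v);  v' = v + q.w * t + q.xyz x t
      Vec3 t = {2*(q.y*v.z - q.z*v.y), 2*(q.z*v.x - q.x*v.z), 2*(q.x*v.y - q.y*v.x)};
      return {v.x + q.w*t.x + (q.y*t.z - q.z*t.y),
              v.y + q.w*t.y + (q.z*t.x - q.x*t.z),
              v.z + q.w*t.z + (q.x*t.y - q.y*t.x)};
    }

    int main() {
      // Target 800 m north, 200 m east, 50 m below us (assumed north-east-down frame).
      Vec3 targetWorld = {800.0, 200.0, 50.0};

      // Head pose reported by the glasses (identity here: looking straight north, level).
      Quat headPose = {1.0, 0.0, 0.0, 0.0};

      // World -> head space: apply the inverse (conjugate) of the head pose.
      Vec3 p = rotate(conjugate(headPose), targetWorld);

      // Simple pinhole projection: x forward, y right, z down (assumed convention).
      if (p.x > 0.0) {                                 // only draw targets in front of the viewer
        double fovTan = std::tan(30.0 * kPi / 180.0);  // assumed 60-degree horizontal field of view
        double screenX = (p.y / p.x) / fovTan;         // -1 .. +1 across the display
        double screenY = (p.z / p.x) / fovTan;         // -1 .. +1 top to bottom
        std::printf("draw traffic symbol at (%.2f, %.2f)\n", screenX, screenY);
      }
      return 0;
    }
    [/code]

    In a real app this runs every frame with the latest head pose, and each eye gets its own slightly offset projection for the stereo separation mentioned above.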
     
