Bryan Shumway @bshums
Alright, some thoughts on @magicleap! Been waiting a long time and I’m excited for others to get in, explore Magic Leap, and to understand the possibilities of a good mixed reality experience. #MagicLeap #MagicLeapOne
Packaging and materials are really nice. I was surprised at how easily everything came together out-of-the-box.
Setup was both simple and difficult at the same time - I wish I could’ve entered all my credentials (Wi-Fi, username/password, etc.) in the companion app and then used the app to pair with Lightwear and transfer them over (similar to Apple’s HomePod setup experience).
Entering all those credentials on-device was cumbersome. Because the device already needs time (or an Enjoy representative) for sizing and fit before use, streamlining the remaining setup steps would greatly reduce the pain of getting started.
The overall control scheme/input feels good! But also a bit inconsistent so far. Certainly this is a new platform and mixed reality is a different beast than other digital platforms, but each application seems to be controlled by an ever so slightly different set of inputs.
Some rely on the touchpad, others use head tracking, some use point/motion control, and still others use all of them as inputs. I assume much of this is really a difference between the Landscape and Immersive apps, but nonetheless it caused a bit of frustration.
I’ve had to dig around and go through more trial and error than I had anticipated, including a few instances of the Control freezing up entirely, requiring a reboot. Things I’m sure the team is working on, but they’re worth sharing anyway.
The visuals are fantastic, with a nice amount of depth and detail. No screen door effect, no ghostly transparency. The digital lightfield is real! Auto-brightness settings (configurable) also help what is displayed appear vibrantly and consistently with what is around it.
You’re unlikely to confuse digital objects for real objects - I placed an @NBA game next to my television also playing an NBA game & the difference was clear. However, these objects & images are a much better representation than what we’ve seen before in these types of experiences.
Meshing was fast and easy, and barring a few difficult surfaces/shapes, the various rooms I’ve used the Lightwear in have been mapped well. On-device storage of previously meshed areas makes returning sessions quick by remembering where each object should go.
I’d like to see the ability to re-mesh an area, as I did notice some degradation in the mesh over a few separate sessions.
Because the meshing is generally very good, objects deal with occlusion better than I expected. In #Create, the Knight figurines jump off my tables & ledges, the Fish swam behind my couches & chairs, & the Dinosaurs navigated obstacles as they rampaged through my apt.
The app #Create gives the user a variety of tools to play with in a space, and because of the mesh, objects in Create interact w/ the space & with each other in really impressive ways.
Red Knights fight Blue Knights. Dinosaurs eat Hamburgers. UFOs abduct Sea Turtles. Then there’s the Domino Set, the Marble Track, & various other tools for Rube Goldberg wannabes. I’m ready to see the next ‘Biisuke Ball’s Big Adventure’ (search for it and thank me later).
The audio cues in both #Create & #Tonandi add another layer to the experience, helping tie together physical, digital, & the way those two interact in space. When touching a fiery object in #Tonandi, heat was almost perceptible. Beyond that, each melody is relaxing & delightful.
The web browser feels practical, but like most web browsers not navigated via mouse, it also feels difficult. W/o the browser placed in a large window, text felt small/difficult to read, & though it’s configurable, it took some getting used to & impacted formatting of some sites.
The @NBA App experience was enjoyable, particularly the Court View, which presents the user with a miniature volumetric video capture of the basketball game in addition to the standard broadcast view - an experience that certainly added a wow factor for me.
I can envision additional sporting events, maps, construction site blueprints, agricultural crop imagery, etc., with that sort of volumetric view.
The @Wayfair browser experience takes the ‘place a piece of furniture in your house’ idea & gets closest to executing it as intended. The mesh allows for accurate placement, the wearer can perceive their real space, & it allows for a greater field of view than smartphone AR.
To try to get an understanding of a potential user flow that combines each experience into an overarching one, I imagined the following scenario: I’m a remote worker or traveler, and I work daily out of hotel rooms, @WeWork spaces, and the like.
Using a device such as @magicleap can let me instantly set up multiple screens, browsers, content windows, & experiences, even in an empty room w/o other physical displays or hardware. In addition, because I’m traveling, my space is not mine, & I can’t bring my own life with me.
With @magicleap , I can bring objects & images such as photos, books, objects, furniture, & other pieces to make my spartan hotel room or @WeWork space feel like home. The device remembers where everything is placed, & when it’s time to leave I just power down the device.
It’s a future that is still a bit hard to imagine, particularly when most of us need to do some form of input, editing, & creation. However, I started to get a clearer view of how a device like this could be used beyond gaming or entertainment. I’m excited to create that future.
More @magicleap thoughts, photos, and video to come! Again, great work from the team! @rabovitz @_daveshumway
To give you a taste of the joy @magicleap provides, here are some quotes from those who’ve demoed it on my ML1:
“I feel like I have real fish in my house right now!” (Create)
“I have my little knights riding sea turtles and they’re adorable!” (Create) @magicleap
“What?!? My spaceship just absorbed my jellyfish!” (Create) @magicleap
“*GASP* Blocks! I’m gonna have SOOOO much fun with the blocks.” (Create) @magicleap
“I love that I can feed the fish! I could do this all day! This is so cute!” (Create) @magicleap
“The music is so peaceful and soothing” (Create) @magicleap
“Oooo an explosive barrel? That seems exciting” (Create) @magicleap
“The red knights are winning! Time to hit em with a T-Rex!” (Create) @magicleap
“My hand feels like it’s getting hot! It’s heating up!” (Tonandi) @magicleap
“Whoaaaa, look at lil Lebron @KingJames dunk! That is SO COOL!” (@NBA App) @magicleap
“I just bought a house. I’d LOVE to sit here and design my house and furniture layout with this.” (@Wayfair in Helios) @magicleap
Other thoughts: I wish I could see what others see - perhaps via the companion app?
Other thoughts: each demo session went at LEAST 45min with the user not even noticing the passage of time
Other thoughts: the users were SO ENGROSSED that they couldn’t turn their attention to anything I said to them while demoing.
Other @magicleap thoughts: I’ve yet to take a look at accessibility overall in the UX, but I‘ll note voice typing, closed captioning, upcoming prescription lenses, & one more thing as signs of accessibility & inclusivity that should be applauded - the shoulder strap for Lightpack.
Generally Lightpack ‘clips’ onto a sturdy pocket for an untethered experience. However, not every person has pockets, stands during use, or otherwise has a means to clip Lightpack to their person. Included with the @magicleap device is a shoulder strap for Lightpack.
Surely this could have been sold as an accessory, but including it sends a nice message that @magicleap wasn’t just thinking of a certain group of early adopter men in tech as their audience. The strap gives every person the same experience. Bravo 👏
More video of the Create experience and some of the things I described in earlier tweets.
Video showcasing how I can use @magicleap to enhance a space. Gallery pics to make it feel like home, digital screen for content, physical screen for content, browser for web content. Because ML1 recognizes the space, it is automatically set like this every time I use ML1 here.
Notice also how control moves between applications just by shifting head tracking/eye focus. It’s really slick. @magicleap