augmented reality – As Seen Through PeriVision: a mobile-centric blog, full of tech goodness
https://www.perivision.net/wordpress

TikTok parent Co. ByteDance may buy Pico, why that is big news for AR/VR
https://www.perivision.net/wordpress/2021/08/tiktok-parent-co-bytedance-may-buy-pico-why-that-is-big-news-for-ar-vr/
Thu, 26 Aug 2021 21:29:46 +0000

Well, it seems the deal went through! When this was first written it was still a rumor, but let's say the ByteDance-Pico deal does go through. What does that mean for the AR/VR space?

Currently the consumer VR space is pretty much dominated by Facebook with the Oculus Quest: a very good device, lots of content, a very affordable price. With the release of Workrooms, we are seeing a slide into the soft enterprise space (more on Workrooms later). OK, so why are TikTok and Pico so interesting?

TikTok has followed both FB and Snap by offering an AR development platform. Snap has already released a dev version of AR glasses that can take advantage of their AR platform. And remember, they bought WaveOptics as well, so clearly they are committed to creating an AR device in the near term. What is interesting is that, given Snap's reach, they are positioned to be the first AR glasses company to sell in large numbers. Although I'm certain FB will come out with consumer-level AR glasses very soon, they do not have the same penetration that Snap has for user AR content creation. Given TikTok's current reach in the user-creative space, it would not be hard to imagine ByteDance looking to compete directly with Snap. And the two of them need to move fast to buffer FB's AR ambitions.

So how does Pico play into this? ByteDance recently bought a gaming studio to help bolster their content pool, but I think this will mostly be for the Chinese market; elsewhere, consumer VR is pretty much owned by FB right now. However, if ByteDance wanted to create AR-lite glasses somewhat like Snap, Nreal, and Lenovo, this would be a step in the right direction. I would not be surprised to see a follow-up acquisition of Nreal, or failing that, MadGaze.

This gives ByteDance quite a bit of power in the Chinese market.

  • Continue to expand the VR consumer space in China, perhaps with more gaming studio acquisitions.
  • Support AR pass-through and copy FB Workrooms to create a Chinese version. Remember, Pico uses the same XR2 chip the Quest uses, so it's doable.
  • Create a set of AR glasses, either through acquisition or other work that Pico has been doing that we do not know about.
  • Leverage TikTok's popularity to compete with Snap, and soon FB, in the western market for consumer AR.

So what is missing?

Right now, both Snap and perhaps ByteDance still have a content issue. Yes, AR filters are very popular, but that will not be enough to sustain interest. To succeed they need access to the consumer's phone, which offers FAR more content than any one walled garden can ever create. To do this they need to figure out how to take mobile 2D content and display and interact with it on the XR device. This is why I wrote about Samsung DeX for AR-lite in the past. But guess who else has a DeX-like experience? Huawei.

I’m calling this now. Watch for a future agreement between ByteDance and Huawei for a mobile-based desktop experience that works in both VR and AR.

Also watch for further moves by ByteDance to implement AR glasses. It's getting closer ... perhaps 2023 is when we will see the first major plays into consumer AR.

Snap Glasses are ok, but $500M for WaveOptics is really interesting
https://www.perivision.net/wordpress/2021/05/snap-glasses-are-ok-but-500m-for-waveoptics-is-really-interesting/
Sat, 29 May 2021 16:34:59 +0000

Recently Snap dropped on us their new dev version of AR glasses. There was not much on specs, save 2000-nit (1000 per eye, I guess) waveguide displays. Since it's tethered, I would guess it pretty much follows most of the other AR glasses I have seen pop up based on the Qualcomm chip.
However, what was really interesting to me was the announcement from Snap the following day: a $500M purchase of WaveOptics. This was a signal to me.

Typically, announcements like the Snap DK glasses, or Lenovo's, or even the Niantic announcement always feel like a soft foot in the door: trying to gauge interest while at the same time working with developers to create content and, more importantly, shaking out the bugs in both the wearable and the ecosystem for an AR experience, be it a hardware, software, or ecosystem play.

M&A, however, always catches my attention. Yes, companies like Apple and Facebook are buying companies left and right; however, when companies that are not quite as awash in cash start spending, that tells me something.

I have been around for a while; I saw the first great transformations, and the resulting VR bubbles, in the late nineties and again around 2016. After both great leaps in XR capability, the same hype always seemed to follow: XR will explode onto the market.

ABI Research Reports $2 Billion+ in Funding and M&A for AR/VR

Although I do not believe we will see growth like the prediction above, I do believe the market will grow, and around 2025 we will see a third revolution in XR systems. I'm not the only one who sees this; in fact, it's somewhat expected that 2025 will be when we really see AR glasses. Ecosystems are already being built and developers are being invited in. It's 2025 and after that I think we will see a change in consumer habits and substantial market opportunities.

So, returning to why I think Snap's purchase of WaveOptics is so interesting. To compete in the new AR market, you need three things:

  • Experience delivery. In this case, lightweight, attractive glasses.
  • Content. Simply having a few cool apps is not going to do it. You need just as much content as we have on our phones today.
  • Users. It does not matter how good your system is if you do not have a user base ready to consume it.

The last two take quite a bit of investment unless you already have access to the phone (be it Apple, Google, a major platform player like Microsoft, or a major phone manufacturer). So hardware is the third part, where you can make a play to create content for your ecosystem and hopefully build a user base. Snap already tried this with their first foray into camera-only glasses. It didn't work out all that well, but the concept is still sound. If you own the hardware for display and/or input, you have at least some control over the content created for AR consumption, be it wearable or, currently, phone-based. Thus the investment in WaveOptics.

Snap has a very good customer base and its apps are popular. So if they can also create a good set of glasses that enhance the particular content their user base already enjoys, they will find themselves well positioned to ensure their apps are part of any AR ecosystem in the future. And that is the key here: the future play. As we move into a new ecosystem that is part phone and part AR-glasses based, how do you ensure relevancy?

$500M is no small M&A. Snap is betting that they can get enough glasses out there to sustain content for their ecosystem. This current device is clearly a DK, but you do not drop half a billion on a company unless you plan to build future devices.

In the near future, there will be an interesting battle of ecosystems, open versus closed. Apple will create a closed system, as we can already guess. Niantic is creating a metaverse, as is Nvidia for engineers, as is Epic, which just raised $1B for this effort. However, if AR glasses can talk to phones the way Bluetooth devices can, then the question becomes: how do these glasses consume content and access these various metaverses? If, for example, Google creates a protocol that allows any wearable AR device to access and display phone content, then apps that have already matured for the AR wearable market will have a substantial advantage; they can grow and capture new users with features, while apps that have not yet matured may find themselves displaced. Those apps, in turn, can decide which metaverse(s) to support.

This I can see being worth a $500M investment.

With consumer AR, function follows form: consumer AR use cases for Nreal glasses
https://www.perivision.net/wordpress/2020/06/with-consumer-ar-function-follows-form-consumer-ar-use-cases-for-nreal-glasses/
Wed, 01 Jul 2020 04:37:24 +0000
Yes, that is a bass ukulele on the wall

I’m going to start this post with this statement: with consumer AR, function follows form. Now, most of you might remember the old design mantra, ‘form follows function.’ For those who are unfamiliar, it basically implies you design for what you need first, and the form will follow. As a past Berkeley architecture student, this was drilled into my head. In consumer AR, however, I think this is flipped. When you wear something, especially on your head, design matters. Given two products, a less attractive but more capable one versus a more attractive but less capable one, people will tend toward the more attractive product. Function follows form.

UPDATE: There has been an update to Android, so DeX no longer launches when connected. Sad Panda. 🙁

Cue the Nreal AR glasses. They look like ODG glasses, but smaller, and this makes a big difference. I really like the looks of the ODG, but they are just too bulky, heavy, and hot. Nreal removed all the compute from the head to slim it down. Yes, you lose some performance by moving CV to the phone, and yes, there is now a cord hanging down the back, but given the value provided, this is minor.

In consumer AR, function follows form. Let’s look at the HoloLens/Magic Leap vs the Nreal. Now, they are completely different devices as far as I’m concerned. The HoloLens is STILL the best AR device you can buy, but it’s big and expensive. The Nreal is small, light, and only $500. However, what is really important here is this: I can be at a cafe with the Nreal and feel OK, where I would not with the HoloLens or any other capable AR device on the market. Why? Because the Nreal looks almost like a normal pair of sunglasses. Yes, it is far less powerful than the other AR headsets, but it still supports most of my AR use cases. As such, I argue you need two things for a successful consumer AR device: a very light, attractive device and access to lots of content. Combine the Nreal with a phone (better yet, with a desktop like DeX) and you have it.

ROI: When I first got the GearVR, I loved it. I was traveling a bit back then and I LOVED the fact that I could put it on my head and be watching videos and playing games. The device only cost me just over a hundred bucks and I already had the phone. The ROI was easy: I got more value out of the device than what I put into it. I also got the Vive. After a few weeks I felt done with it. It was far too troublesome to set up. Yes, the Vive is better and could do more, but I could throw the Gear in a bag and take it anywhere. Thus I used the Gear FAR more often than the Vive.

Which brings me to the main point of this post.

Consumer AR use cases will face the same ROI question. How often will you REALLY use it, and given those occasions, was it worth the money/trouble? I have read so many articles talking about all these great consumer AR use cases: virtual shopping, fitting new furniture in your place, even playing a virtual piano. How often do you really think you will use them? Most likely you will use a device often if three conditions are true: 1) it is easy to use/access, 2) it satisfies a frequent use case, and 3) if used in public, it is easy to carry and looks good.

Use case examples:

Before I go into these use cases, keep in mind they are not definitive or exhaustive, but illustrative, supporting the point that use cases which drive frequent use do not need an expensive AR device to deliver value. In fact, less is more.

Short media consumption on the go: I define this as short sessions (say 10-20 min): alerts, short videos, etc. Anything that is a quick bit of information consumption and interaction, but longer than something you can see on your watch. For example, reading emails or social media posts, or viewing videos or photos. Yes, you can do this on the phone, but if I have easy access to my glasses, I can enjoy a larger screen and better sound, and not keep looking down. I’ll also include HUDs here as well: walking or biking where you can get alerts and/or navigation or points of interest.

Information augmentation: This can be language translation, getting more information on a store product, bar or QR codes, object recognition, etc. Anything where one bit of information can be modified or expanded upon. In this case the phone would be the primary camera, allowing greater flexibility when trying to capture information.

DeX seen through Nreal

Augmented desktop: Right now this is limited to Samsung phones as far as I know. When I connect the Nreal glasses to a Samsung phone, I get a DeX desktop. This is a big use case for me and something I can use right now when I’m away from my desk or traveling. When I plug the USB-C cable into the phone, I get a 1080p desktop. Add a Bluetooth mouse and keyboard, and I’m set. Bonus points: as long as you are not trying to use the same app, you can run the desktop on the glasses and some other app on the phone (still need to test this more). BTW, you can use the phone as a mouse and keyboard as well. The only downside is it’s head-locked. I’m hoping this can be addressed.
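To make the head-locked limitation concrete: the difference between a head-locked and a world-locked screen is just which frame the screen's position is expressed in, and whether the compositor subtracts out head rotation. A toy Python sketch (purely illustrative, yaw-only; nothing from the actual Nreal or DeX stack):

```python
import math

def view_angle(screen_world_yaw, head_yaw, head_locked):
    """Horizontal angle (radians) at which the screen appears in the user's view."""
    if head_locked:
        # Pose is defined relative to the head, so the screen never moves in view.
        return screen_world_yaw
    # World-locked: subtract the head's rotation so the screen stays put in the room.
    return screen_world_yaw - head_yaw

screen_yaw = 0.0            # screen placed straight ahead at startup
head = math.radians(30)     # then the head turns 30 degrees to the right

print(view_angle(screen_yaw, head, head_locked=True))   # 0.0 -- screen follows the head
print(view_angle(screen_yaw, head, head_locked=False))  # about -0.52 rad -- screen stays in the room
```

In other words, fixing head-lock is less about raw horsepower and more about feeding the compositor a head pose to subtract out.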

Fits in my pockets.

Sidebar on travel: I used to travel quite a bit, for fun and business, and I like to travel light. My current setup is a Samsung Fold, a folding keyboard, and a flat mouse. Now, with the Nreal, I will be adding one more device to my mobile war set. The Samsung Fold is already a big screen, but with the glasses, any phone should be fine. Again, we still have the issue that DeX is head-locked. Side note: I still need to see if I can power the phone and keep the glasses connected at the same time. Another post for another day.

Samsung Fold, Folding Keyboard, MS Curve mouse, and Nreal glasses
First test with DeX with Nreal:

So, we just looked at four use cases that the current Nreal glasses and a high-end phone can support. What about the cord? I got used to that really fast. For years we have been listening to our music using wired headphones and ear buds and no one has batted an eye. I have the same feeling with a cord coming from the glasses. It does not feel awkward and does not get in my way, having used wired headphones for most of my life. BTW, it works with a USB-C extension.

Each use case has the same thing in common: they are things you would do often. They do not require much compute power or sensing technology. They do, however, require that you feel comfortable using them in public.

Now, I do not want to give the impression the Nreal is perfect. Far from it. It does not fit me well (Nreal: Day 0). When I use DeX, the display is head-locked. There is no easy on/off for the glasses while connected to the phone. And there are a few more small issues. But the main point is this: the Nreal is the first AR-lite glasses I have seen (pun intended) that set the bar for all other AR-lite headsets to follow in the near term. Sure, sometime this decade we will have wireless glasses with higher resolution and detailed hand tracking and a bunch of other stuff.

But I do not want to wait that long.

Nreal AR Glasses day 0: The good and the bad
https://www.perivision.net/wordpress/2020/06/nreal-ar-glasses-day-o-the-good-and-the-bad/
Sun, 28 Jun 2020 01:22:20 +0000
Nreal Dev kit

I just got the Nreal AR glasses dev kit. I’m going to skip the unboxing and intro stuff since there are plenty of those on the web already. This is a day 0 review: what I did as soon as I got them. So let’s jump right in.

First off, the glasses do not fit me properly. I had a chance to look at the Nreal glasses last year and knew this would be an issue. The fix? Well, since I have to wear glasses anyway, I fixed it with a few bread ties. Almost all engineering problems can be fixed with duct tape and bread ties.

Glasses hack

Now that I can see clearly, I first started up the puck and tried the demo experiences. They are good. Again, there are plenty of reviews on the web, so unless there is a particular question, I can say in general the colors are bright, rendering seems around 30-45 fps, and the FOV is good enough. Images do have some swim and jitter, but that is dependent on the environment. In a perfect setting, the image holds position pretty well; in a more challenging one there is less stability, as one would expect. Rotation correction (timewarp or pLSR, depending on who you talk to) is pretty good. I point this out because I assume all the LSR functions are on the phone as well, but it’s possible pLSR is done on the headset. The NrealTower app does not work for some reason; I’m still trying to sort that out. However, I’m not that interested in those demos. Instead, I want to connect to my phone.
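For readers unfamiliar with the rotation correction mentioned above: timewarp re-projects an already-rendered frame by the head rotation that occurred after rendering, which for pure rotation reduces to the homography H = K·R·K⁻¹. A toy Python sketch under a pinhole camera model; the intrinsics (f, cx, cy) are made-up numbers for illustration, not the Nreal's actual values:

```python
import math

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def yaw_matrix(theta):
    """Rotation about the vertical axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def timewarp_point(px, py, d_yaw, f=800.0, cx=640.0, cy=360.0):
    """Reproject a rendered pixel for a head rotation that happened after rendering.
    Rotation-only homography: H = K * R_delta * K^-1 (pinhole intrinsics K)."""
    K    = [[f, 0.0, cx], [0.0, f, cy], [0.0, 0.0, 1.0]]
    Kinv = [[1.0 / f, 0.0, -cx / f], [0.0, 1.0 / f, -cy / f], [0.0, 0.0, 1.0]]
    H = matmul(matmul(K, yaw_matrix(d_yaw)), Kinv)
    x, y, w = (H[i][0] * px + H[i][1] * py + H[i][2] for i in range(3))
    return x / w, y / w

# Head yawed ~1 degree between render and display: the centre pixel shifts
# sideways by roughly f * tan(d_yaw) pixels, while y stays put.
print(timewarp_point(640.0, 360.0, math.radians(1.0)))
```

The point of doing this as a last-stage 2D warp is that it is far cheaper than re-rendering the scene, which is why it can plausibly live on either the phone or the headset.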

<caution: segue rant ahead> A LONG time ago, I went to visit ODG in SF to have a look at their AR glasses. I have a background in 3D graphics, VR, and computer graphics going back to 1995, so I immediately saw a problem with their approach: this is going to be a world of hurt because of the power/heat requirements. So I asked: hey, can I get a set of glasses with only camera and IMU support? I want to run everything on the phone. I got a look like, ‘Are you nuts? No.’ I left there disappointed, but convinced that a successful AR HMD needs to do two things: be very lightweight, and have lots of content. Connecting to the phone does both. Our phones are powerful and have as much access to content as our PCs. If I can get that content on my glasses, that would be worth the price. Cue in something called Samsung DeX, and now I can have a desktop experience in AR! Full disclosure: I work at Samsung in R&D and got a really hacked version of DeX working in VR and AR a few years ago, so I have already experienced how cool this could be. 🙂

Mobile work center: Samsung Fold, Bluetooth keyboard/mouse, and now Nreal

So back to the Nreal. ALMOST right out of the box, I connected the glasses to my Galaxy S10 (later the Samsung Fold) and boom! I have HDMI. I say almost because you need to hold down brightness-up to get into 3D mode on the glasses. Once that happens, DeX!! I already have a travel setup with a Bluetooth keyboard and mouse, so I was up and running right out of the box. A limitation right now is that the display is head-locked. This is fixable by setting up a virtual display and getting RGB images from the video buffer to the HMD after performing timewarp or pLSR, or something like that. I’m sure they will figure it out sooner or later. Once they do, I can have two screens: DeX in the glasses and the normal phone display. I was pretty impressed I could have both running at the same time. Not sure how long this will last from a power point of view, so I need to experiment with this more. So all in all, my experience with DeX was great. Images are bright and clear, text is easy to read, and if you keep the brightness down, there is no eye strain.

View of DeX through the Nreal

On day zero I had a web meeting that I needed to attend. As an experiment, I logged in like normal, but instead of using the PC, I connected the glasses to the phone. Same thing: HDMI head-locked video, just on the phone. I was able to participate in a one-hour online meeting using just the glasses and a Bluetooth mouse. I was sitting up on the bed, fully relaxed, drinking tea, and had no problem reading content on the PowerPoint slides. I could even reply to messages when I later added my Bluetooth keyboard. Typing on the phone is a bit of a challenge with the Nreal being head-locked.

This event is pretty key for a zero-day experience: I was able to use the glasses right out of the box. Later I watched the news via YouTube on the phone. The only thing that stopped me from using the glasses even longer is that the hack I have to attach them to my glasses is not the best. Once this is sorted out, I can see using these for long conference calls so I can walk away from my desk now and again.

So to sum up, on day zero I had access, via the phone, to a desktop experience (DeX), web browsing, office docs, videos, a few games, chat, photos, etc. I know what you are going to ask: then why not get an HDMI HMD viewer? A few reasons. Head-locked is still a limitation; to free up head-lock we need 3DoF at minimum, and 6DoF is better in my mind. Plus, there are going to be times I will want the full AR experience. I do not want to buy a device that cannot support this.
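To illustrate why I say 6DoF is better: 3DoF tracks orientation only, so a 'world-fixed' window survives head turns but not a physical step sideways, while 6DoF re-solves the window's direction from the true head position. A toy sketch with hypothetical numbers (2D top-down, yaw only):

```python
import math

def apparent_yaw(screen_pos, head_pos, track_translation):
    """Yaw (radians) at which a world-fixed screen should be drawn.
    With 3DoF tracking, head translation is unknown and effectively ignored,
    so the screen is drawn as if the head never moved; with 6DoF the direction
    is recomputed from the true head-to-screen offset."""
    hx, hz = head_pos if track_translation else (0.0, 0.0)
    sx, sz = screen_pos
    return math.atan2(sx - hx, sz - hz)

screen = (0.0, 2.0)   # screen 2 m straight ahead of the starting position
head   = (0.5, 0.0)   # user steps half a metre to the right

yaw_3dof = apparent_yaw(screen, head, track_translation=False)  # 0.0 -- no parallax at all
yaw_6dof = apparent_yaw(screen, head, track_translation=True)   # about -0.245 rad (-14 deg)
print(math.degrees(yaw_6dof))
```

With 3DoF the window rides along with every step you take; only 6DoF can keep it pinned to a spot in the room.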

BTW, follow-up to this post: function follows form: consumer AR use-cases

Now some questions I got since people found out I had this.

Q: Did you find the cable an issue? No, not really. I have a USB-C extension cable so I feel pretty free, but mostly I just place the phone wherever and it’s fine. My wish right now is for the DeX phone UI to support keyboard entry in addition to mouse entry. So this is a Samsung issue.

Q: Does it get hot? Yes, but the heat dissipation is top-front, so I never really feel it.

Q: Does WebXR work? I read that it’s coming at the end of the year. I will try to get invited as an early tester.

Q: How is your experience compared to the HoloLens 2.0? I’ve said many times before, the HoloLens is the best example of AR you can get. V2.0 is better, but not a massive jump. That being said, the CV on the HoloLens 1 is better than the Nreal’s. The graphics feel about the same as far as brightness and ease on my eyes (headset fit notwithstanding). I do not think the Nreal will ever be as good as the HL in terms of CV, but it does not need to be. Two different products with two different customer segments.

Q: Does it feel heavy after long use? Hard to say. The longest I have used it is 1.5 hrs, and both times I was either lying down or reclining, so the weight was a bit more off the nose than if I wore it standing. So, more testing to do.

Q: How was reading text? Again, it was head-locked, so until they support at least 3DoF I will hold back judgement, but while head-locked, I could read fine.

Q: Can you connect to a PC? Yes, you get pretty much the same experience as DeX. Same problem though: head-locked.

Q: How does it compare to the Magic Leap, since they are both tethered? Quick answer: pretty much the same. I got used to the Magic Leap tether pretty quickly, and same with the Nreal. What is better about the Nreal is, of course, that I can plug it into various devices, and I have a USB-C extension that gives me extra room. But from a user point of view, it was a non-issue. Somewhat like wearing wired headphones: you figure out the best way to route the wire, then you forget about it. The only case where I can see the wire becoming an issue is if I were doing a high-activity experience, like Beat Saber or something like that. And even then, I’m sure I could figure something out.

Q: How is DeX in AR? Did you not just read this post?! 🙂

Any other questions? Let me know and I’ll add them.

View 2D Android apps in VR with PhoneCast – S8 only
https://www.perivision.net/wordpress/2017/09/view-2d-android-apps-in-vr-with-phonecast-s8-only/
Fri, 15 Sep 2017 15:08:56 +0000

Here is something I have been waiting for: the ability to use my 2D Android applications and 2D games in VR. The way it works is that a 2D window is provided in your VR environment, and within that window you can run Android apps! Sweet, right? This GearVR app is called PhoneCast, and it is already out there. Currently the applications you can use have to be white-listed, and all of the current apps are video streaming services, something that does not really interest me. However, I got to test a new version that has a ‘Labs’ area where, instead of white-listing apps, certain apps are black-listed. That means there are certain apps that are not allowed to run, but the rest are up to the user. So of course, some apps may work, some may not. That is why it’s called ‘Labs’, my friends. The everyday apps seem to work fine: email, Google Photos, PowerPoint. Typing with the keyboard was not fun, but this is to be expected. I will say looking at PowerPoint is way better with the large display than using my phone. As much as I like productivity apps, I really wanted to try a few games.

  • Quick note: looks like it is US phones only, running the Qualcomm chip, for now.

Some games seemed to have a glitch the first time I tried them. I tried Fallout Shelter first. It did not load the first time, but worked fine after that. Not sure what the story is here. Same with Pokemon Go: it did not work the first time, then worked fine. However, it locks to portrait, and when using AR, the camera does not display correctly. I turned AR off and it was fine after that. A few other games seemed locked to one viewport or another, but that was not too much of an issue after I got used to it. I would make the window a bit smaller for portrait games, larger for landscape.

I then tried Monument Valley, a favorite of mine. This took three tries, but again, it seems to work after that. The game play was even nicer with the viewport set to the largest setting, despite portrait mode. I have it smaller in my screenshot. It took a little bit of effort to get used to the controller, but I got it down soon enough. I do not like this screenshot that much, but for some reason, screen capture and video capture are iffy with PhoneCast. There is a video of this at the bottom of the post as well.

What was really cool was trying ARCore apps in PhoneCast. It ‘mostly’ works, and when I say mostly, that is because the viewport is not locked to the head. If you check out the video below you will see what I mean. Still, VERY cool! I tried the first ARCore app that was released, ‘Atom Visualizer‘ from Signal Garden. I contacted the good people involved in this project and made a request; we’ll see how it goes. If they can lock the viewport, this would be really fun! Have a look for the video at the bottom of this post.

Here is a pro tip for you: the controller takes a bit of time to get used to, so don’t get put off. Keep working at it. BTW, there is no pinch-to-zoom, something I found out in Fallout Shelter. Hopefully they will find a workaround. I suggested they map something on the controller for zooming, zooming around the center of the view. That could take some work though.

So there you go!  Enjoy.

XR and the Continuum of Immersion in eXtended Reality
https://www.perivision.net/wordpress/2017/02/xr-and-the-continuum-of-immersion-in-extended-reality/
Sun, 05 Feb 2017 23:46:44 +0000

Back in 2016 I did a presentation on the future of VR. One slide was titled: VR, AR, MR… XR? I was basically wondering how to describe yet another way to experience VR, where the virtual world is mapped to the real world and is fully immersive, but one or two real things pop through. Ugh. So I started to refer to the whole continuum of VR-AR-MR as xR. It seems this was not an original idea; lately I have been seeing XR used in the press to refer to Extended Reality, representing the set of various acronyms we have been using. Great. But that is not solving the problem. We still have this alphabet soup of R’s.

So what I have now started to do is think of things not as VR, AR, HUD, or whatever-R, but in terms of levels of immersion. By thinking about the application of XR, I think we get away from the trap of deciding which R we want to use. You have fully immersive, as with VR; semi-immersive, like AR and MR; and non-immersive, like various AR and HUD systems. By thinking about it this way, we do not worry about which of the XR’s we are talking about, but instead about the immersive quality of the simulation and how it affects the user’s experience.

For example, pain aversion: it does not matter if it’s AR or VR; what matters is whether the patient’s attention is sufficiently redirected to reduce pain and anxiety. A pilot does not care if you call the advanced helmet display a HUD or AR, but whether they understand the information in the context of the landing and how the weather is affecting the aircraft’s approach vector. Also, how does one interact with the content? Do we need deep haptic immersion, or is simply viewing with an HMD, or even holding a display like a phone, good enough?

When thinking about which of the various R’s in XR we want to explore, we need to start with the use case. What is it we are trying to achieve? Far too often I hear people talk about using VR for this or that, and I have to keep asking: WHY? What are you gaining from going into a VR simulation when looking at that same content on a screen is actually better HCI? “Here is the idea: you can see Excel data as dots in VR!” ‘Great, but I still need to read the numbers…’ “Oh, you just gaze over dot by dot and you can get the numbers.” Sigh. This was an actual conversation, BTW. If we were talking about thousands of values, perhaps, but this was for any Excel sheet.

When exploring any XR technology, let’s not be a hammer looking for a nail. Instead, let’s ask: how would immersion improve the experience? When we have that answer, we should ask what level of immersion is best suited to the task. THEN we can determine which of the many XR’s we should employ.

My Experience at the world’s first Holographic Hackathon
https://www.perivision.net/wordpress/2016/05/hololens_hackathon/
Sun, 29 May 2016 06:34:37 +0000

So this past weekend I went to Seattle to participate in the first-ever Holographic Hackathon. This event, sponsored mainly by Microsoft, gave a select number of lucky individuals the opportunity to work with the Microsoft HoloLens device. As an organizer of hackathons for the past few years, I was beyond excited to be a participant for a change. True, this weekend was not my first chance to be around the device (we had several at our AEC Hackathon last month in SoCal), but it was the first time I had an uninterrupted weekend dedicated to developing for it.

For those that are not familiar with this augmented reality headset, see the video below from Microsoft’s Build 2016 keynote.

Hackathon Day #1

The Holographic Hackathon was held at Fremont Studios just north of downtown Seattle. The venue was a super cool open space with blue lights and black curtains along the walls.

The crowd was an interesting mix of software & Unity developers, audio engineers, artists, and non-technical professionals from various industries. I was pleasantly surprised to see several folks from the AEC (Architecture, Engineering, & Construction) space present, along with over a dozen others who have participated in the VR or AEC-related hackathons I help organize in Seattle. This diversity of skill sets and knowledge made me even more excited about the weekend.

Friday evening started off with a networking mixer, followed by an introduction from the Microsoft HoloLens team and the Seattle VR and Windows Holographic User Group Redmond (WinHUGR) Meetup groups. They laid out the agenda for the weekend, shared some great resources to help our hacking, and then opened it up to team formation. People wrote their ideas on a huge paper tablet and either stood at their table or walked around looking for team members or teams to join. While I came with a team of three, none of us are pro Unity developers (Unity being the main development environment for HoloLens), so we actively sought someone who could assist with our project. After giving the pitch to a few visitors to our table, we finally nabbed a Unity developer named Nicholas Abel who was excited to help us with our construction-related application. After teams formed, HoloLens units were distributed two per table and, wasting no time, we set up and began holo-hacking away.

Our Application
The goal of our team's weekend project was to create a HoloLens version of a building application my tech team at IDEAbuilder has developed for VR (video) and browser-based Web3D using the Kinect (Wired article). This application uses digital-fabrication-ready 3D models of building components to 1) give the user/general contractor the ability to place these items in a scene to build a structure and 2) provide a price based on our material and machine-time costs for fabrication at our factory. Think of the application like a Lego builder app, but with pricing information and other data for building real houses.
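To make the pricing idea concrete, here is a minimal sketch of how an app like this might total up a quote: each placed component carries a material cost and a machine-time cost, and the quote is the sum over everything in the scene. All names, rates, and numbers here are illustrative assumptions, not IDEAbuilder's actual data or code.

```python
# Hypothetical sketch of the Wall Builder pricing idea. The component
# names, costs, and machine rate below are made up for illustration.
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    material_cost: float     # raw material cost, in dollars
    machine_minutes: float   # fabrication time on the factory machines

MACHINE_RATE_PER_MINUTE = 1.50  # assumed flat machine-time rate

def quote(scene: list[Component]) -> float:
    """Total fabrication price for all components placed in the scene."""
    return sum(
        c.material_cost + c.machine_minutes * MACHINE_RATE_PER_MINUTE
        for c in scene
    )

scene = [
    Component("wall_panel_8ft", material_cost=120.0, machine_minutes=30),
    Component("window_frame", material_cost=80.0, machine_minutes=12),
]
print(quote(scene))  # 120 + 45 + 80 + 18 = 263.0
```

The nice property of keeping pricing data on each component is that the quote updates instantly as the user places or removes pieces in the scene.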

Getting Started
As stated, I am not a Unity developer. Sure, I can do basic things in the game engine, but as an open-standards web guy with a focus on real-world industries, it is not my tool of choice. While the rest of my team got hacking away at the project, I began building up my knowledge of the tool for HoloLens development via the tutorials at Microsoft's Holographic Academy website. Hands down, this site is among the best resources I have ever accessed for learning anything tech-related online. The video and text tutorials were very clear and gave step-by-step instructions on how to develop for the HoloLens. These tutorials are so well put together that I feel anyone, even someone completely new to Unity, can get up and running with development for the device in no time. I was so excited about the progress I was making with the device that time flew by. Knowing it was going to be a long weekend, I called it quits a little after midnight.

Day #2

I arrived Saturday morning right after the start of the day's presentations by various members of the Microsoft HoloLens team. While I was interested in what the presenters had to share, I was more excited about getting back into Unity for more HoloLens development. Wait, did I just say I was excited about getting back into Unity development?!? In all the years of knowing about Unity (David H. is a friend) and the times I have HAD to use it, I have never been excited about doing so, yet now I was. Seems I have the fever for HoloLens development, and I will learn whatever tool I must to do so.

By mid-day I had created a few different HoloLens examples and even had time to show some others how to build for it while the rest of my team cranked away at our project. One application I made showed the digital fabrication model of a home IDEAbuilder helped build in Tahoe, AR'ed into the room. Seeing this model in the HoloLens impressed the hell out of me, as 750,000+ polygons is a lot to render on any mobile AR device. While not accurate in its scale, the model still took up a huge amount of the venue's space. The video below of this app was recorded with the HoloLens' video record feature.

I continued my HoloLens education for the rest of the day and night when not helping my team members finish our project. I was having so much fun learning and building that 8:00 AM Sunday morning snuck up on me. That is when I realized I should probably nap for a few hours, since I had pulled an all-nighter, and I face-planted on a big, super-comfy beanbag chair.

Day #3

I awoke a couple of hours later to jump back into it with my team. The energy of the room was intense as everyone got their projects and presentations ready for the hackathon deadline. A little after 1:00 PM the presentations started, and what a variety of projects they were.

There were close to 20 teams, with HoloLens apps for medical, storytelling, shopping, and more. There were even a few AEC-related apps by AEC Hackathon alumni. Cody "Kick-ass" Nowak's team made a cool AR measuring app, and Willard "the Wizard" Williams showed what architects can do with the device. I could write a whole post on what all the teams created, but I recommend you check out what they have shared on Twitter using the #HoloHacks hashtag and on the HoloHacks Facebook page. I hope the Microsoft team puts the videos of the presentations online as well!

This video shows some presentation slides and video footage taken in the HoloLens of our Wall Builder project.

Although our team didn't win any of the three categories or the Unity Pro license prize, we all feel the weekend was a huge win and a success in itself. I have much love for my team members Chris, Nicholas, and Greg. Thanks, guys, for all the hard work and the awesome time. We really raised the bar for innovation in one of humankind's oldest industries.

I learned a lot, made new friends and saw old ones, and am definitely excited about developing for the HoloLens. Our team is going to polish our app and take it much further. I will publish the hackathon version of the Wall Builder app to the HoloLens store in the next couple of weeks, then update this post with links and make announcements on social media for those interested in trying it.

I thank Dona Sarkar, the Microsoft HoloLens team, and the other organizers who made this the best hackathon I have ever participated in as an attendee. I look forward to participating in future HoloHacks and using this device to augment and improve my reality. Now I just have to get my own to keep the innovation going. 🙂

]]>
https://www.perivision.net/wordpress/2016/05/hololens_hackathon/feed/ 0 9975
Microsoft Builds the future of Augmented Reality with HoloLens https://www.perivision.net/wordpress/2015/04/microsoft-builds-the-future-of-augmented-reality-with-hololens/ https://www.perivision.net/wordpress/2015/04/microsoft-builds-the-future-of-augmented-reality-with-hololens/#respond Thu, 30 Apr 2015 05:54:43 +0000 http://www.perivision.net/wordpress/?p=9796 Read More]]>

At its Build Developer Conference in San Francisco today, Microsoft provided some new details on the HoloLens augmented reality device it announced back in January. HoloLens is a head-mounted holographic computer with a depth camera that provides a mixed-reality experience for a range of applications. I briefly covered the HoloLens in a previous post and was hoping Microsoft would share more about the unit today. It seems someone in Redmond was listening to my geek prayers.

Today Microsoft showed off various HoloLens applications that work with Windows 10 and even one that controls a Raspberry Pi robot. They also announced support of HoloLens by the Unity3D game engine.

The Verge has a great recap of the day’s announcements and HoloLens in the video below.

Another demonstration today during the Build Conference keynote session showed the integration of HoloLens with Trimble’s SketchUp 3D modeling software and the Trimble Connect collaboration platform. While these applications are still in development, it really gets one thinking about the HoloLens use in built environment related industries. It will be exciting to see what AEC Hackathon hackers create with this device when it is made available to developers.

I will be following the HoloLens' development, as Microsoft has definitely made this an exciting time for augmented reality.

]]>
https://www.perivision.net/wordpress/2015/04/microsoft-builds-the-future-of-augmented-reality-with-hololens/feed/ 0 9796
Top 10 Apps for the Gear VR HMD https://www.perivision.net/wordpress/2015/03/top-10-apps-for-the-gear-vr-hmd/ https://www.perivision.net/wordpress/2015/03/top-10-apps-for-the-gear-vr-hmd/#respond Fri, 06 Mar 2015 03:17:53 +0000 http://www.perivision.net/wordpress/?p=9685 Read More]]> gear vr back(Note, I’m posting this now, but its not done. But what I do want is feedback. Good list? Bad list? you have a better top 10? Tell me.) I finally got my Note 4 and start loading apps on the Samsung Gear VR and here is my top ten favorites.  Normally I do not include games in my top 10 but with the Gear VR I will make an exception.  Before I get to the list, a comment.  There are a number of Apps that show 360 video and images.  I’m going to list this as one item because basically, its the quality of the video that makes or breaks this experience.  Some of the video I saw was poor and really did not take me into the virtual space where others, like the Cirque du Soleil and the Patrick Watson music demo really make me feel like I was there.  As for the other VR applications I played with, non of them really felt like they were really taking advantage of what you can do in a VR helmet while at the same time, not making the user feel sick.  This is a very difficult feat to achieve.  Creating an experience where all the action is basically in front of you and your are motionless in the environment could also be achieved with a large screen and really good 3D glasses.  However, if you create movement in the VR scene that is too far in conflict to the fact your ears and body are telling you that you are not moving will give your user VOR, better known as VR sickness quickly.

A final note about the Gear VR and the lenses fogging up. Every time I put it on, the lenses fog up on me. To fix this I went to a ski shop and bought defogging liquid for the lenses. It works great for one use, but once it dries, you are back where you started. This is Samsung's first try at a VR HMD, and they made it clear this is a developer's version, so I'll give it a pass for now.

Oh, and use earbuds. Worth it. Many of these demos lose something when playing sound through a speaker.

Unlike other sites, I will put the list down here first; then I will create a page per app, because each one makes the list for a different reason.

10.   Playhead

I was really on the fence with Playhead because it did not appeal to me, but I appreciate the effort and imagination that went into it. I'm sure this would be higher on other people's lists.

9.  DarkNet

8. Titans Of Space

7.  Vanguard V : Nighttime Terror : Lunsee

6.  Dreadhalls

5.  Anshar Wars

4.  Herobound: First Steps

3. Matterport VR : Gyeonju VR Museum

2.  Esper

1.  360 Videos. BTW, the NBA videos look good, but I still wish the video was a bit closer to HD. In a few weeks we are supposed to be able to watch the All-Star game. Not sure if we get to see the whole game, highlights, or what.

Temple VR did not make it because it violates a basic rule in VR: do not move a user's point of view too fast. When the game dodges your viewpoint from one side to the other, you will get VR sickness pretty fast.
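That rule is easy to check for in practice: compare each frame's camera motion against a comfort threshold and flag anything too fast. Here is a minimal sketch of that idea; the threshold values and function names are illustrative assumptions, not figures from any SDK or comfort guideline.

```python
# Hypothetical comfort check: flag frames where the virtual camera
# moves or turns faster than an assumed comfort limit.
import math

MAX_LINEAR_SPEED = 3.0     # meters/second, assumed comfort limit
MAX_ANGULAR_SPEED = 30.0   # degrees/second, assumed comfort limit

def comfort_violations(positions, yaws, dt):
    """Return indices of frames whose camera motion exceeds the limits.

    positions: list of (x, y, z) camera positions per frame
    yaws: list of camera yaw angles in degrees per frame
    dt: seconds between frames
    """
    bad = []
    for i in range(1, len(positions)):
        dist = math.dist(positions[i], positions[i - 1])
        dyaw = abs(yaws[i] - yaws[i - 1])
        if dist / dt > MAX_LINEAR_SPEED or dyaw / dt > MAX_ANGULAR_SPEED:
            bad.append(i)
    return bad

# A sudden sideways "dodge" of about 1 meter in a single 1/60 s frame
# is exactly the kind of motion that triggers VR sickness:
frames = [(0, 0, 0), (0.01, 0, 0), (1.0, 0, 0)]
print(comfort_violations(frames, [0, 0, 0], dt=1 / 60))  # [2]
```

The same test can run offline over a recorded camera path, which makes it a cheap sanity check before shipping a VR experience.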

]]>
https://www.perivision.net/wordpress/2015/03/top-10-apps-for-the-gear-vr-hmd/feed/ 0 9685
REAL 2015 – Where the Sensor meets the Maker https://www.perivision.net/wordpress/2015/03/real-2015-where-the-sensor-meets-the-maker/ https://www.perivision.net/wordpress/2015/03/real-2015-where-the-sensor-meets-the-maker/#respond Wed, 04 Mar 2015 00:31:57 +0000 http://www.perivision.net/wordpress/?p=9728 Read More]]>  This past week was the REAL conference, an event organized mainly by Autodesk to explore the convergence of the professional 3D sensing, making & visualization industries.

From the website:
“REAL is both an exclusive executive summit, REAL TALK, & a world’s fair of cutting-edge 3D demos, REAL LIVE.

REAL is new and different: an immersive, hands-on, high-level gathering in a historic venue with a unique program.

REAL is real people, doing real-world work with reality tech.

REAL is Reality Computing.

WHO
REAL is 500+ leaders and innovators — professionals from across industry, investing, research, and media.

REAL brings together real work spanning disciplines from:
Architecture to Art,
Engineering to Entertainment,
Manufacturing to Media,
Heritage to Health, and
Sports to Science…
REAL is executives & engineers, developers & designers, inventors & investors, architects & artists, makers & meteorologists, surveyors & scientists, entrepreneurs & educators.

WHY
 From drones to autonomous cars, industrial robots to major engineering works, and game consoles to tomorrow’s mobile phones, 3D sensors are suddenly everywhere. And several decades after first grabbing headlines, VR and 3D printing are hot again, attracting billions in investment, and moving beyond early adopters to professionals. But it is the sum total, where sensing meets making, where big change is brewing.

While the ‘Internet of Things’ grabs headlines, a 3D revolution is quietly building.”

Although I was only there for one day, this was quite the event, and I rank it among the best I have ever attended. Yes, it had cool exhibitors and great speakers, but my high marks come from it bringing together communities that normally don't mix, even though they are complementary and/or share technologies. Most parts of the '3D life cycle' were present.

Autodesk pretty much owns the 3D modeling tools space, so 3D creation from that standpoint was in the house, if not directly represented on the expo floor. Most, if not all, of the 3D creation on display came from scanning and capture technologies, with companies like Leica, Matterport, Occipital, and Floored.


Companies like Arup and Autodesk showed off interactive 3D and VR applications while immersive technology companies including IrisVR and Metaio dazzled folks with virtual and augmented reality demos.


A little light on the 'Make' side, the event did showcase some digital fabrication art installations, with Fathom and a few other companies demonstrating how 3D and scanned data can be used for digital fabrication (mainly 3D printing).

Topping things off on the last day, Matt Sonic and the San Francisco Virtual Reality Meetup group held their eighth meeting at the close of the REAL event, which included thought-provoking presentations and VR devs showing off demos (unfortunately none related to the theme of the conference).

This event was a great first edition, and I can't wait to participate in the whole event next year to see what 3D technologies they invite next. As this event grows, the world of 3D is going to get very REAL.

]]>
https://www.perivision.net/wordpress/2015/03/real-2015-where-the-sensor-meets-the-maker/feed/ 0 9728