VR – As Seen Through PeriVision — a mobile-centric blog, full of tech goodness

TikTok parent Co. ByteDance may buy Pico, why that is big news for AR/VR
Thu, 26 Aug 2021
https://www.perivision.net/wordpress/2021/08/tiktok-parent-co-bytedance-may-buy-pico-why-that-is-big-news-for-ar-vr/

Update: it seems the deal went through! At the time of writing this was still a rumor, but let's say the ByteDance acquisition of Pico goes through. What does that mean for the AR/VR space?

Currently the consumer VR space is pretty much dominated by Facebook with the Oculus Quest: a very good device, lots of content, a very affordable price. With the release of Workrooms, we are seeing a slide into the soft enterprise space (more on Workrooms later). OK, so why are TikTok and Pico so interesting?

TikTok has followed both FB and Snap by offering an AR development platform. Snap has already released a dev version of AR glasses that can take advantage of their AR platform. And remember, they bought WaveOptics as well, so clearly they are committed to creating an AR device in the near term. What is interesting is that they are positioned to be the first AR glasses company to sell in large numbers given Snap's reach. Although I'm certain FB will come out with consumer-level AR glasses very soon, they do not have the same penetration that Snap has for user AR content creation. Given TikTok's current reach in the user-creative space, it would not be hard to imagine ByteDance is looking to compete directly with Snap. And the two of them need to move fast to buffer FB's AR ambitions.

So how does Pico play into this? ByteDance recently bought a gaming studio to help bolster their content pool, but I think this will mostly be for the Chinese market; elsewhere, VR is pretty much owned by FB right now. However, if ByteDance wanted to create AR-lite glasses somewhat like Snap, Nreal, and Lenovo, this would be a step in the right direction. I would not be surprised to see a follow-up acquisition of Nreal, or failing that, MadGaze.

This gives ByteDance quite a bit of power in the Chinese market.

  • Continue to expand the consumer VR space in China, perhaps with more gaming studio acquisitions.
  • Support AR pass-through and copy FB Workrooms to create a Chinese version. Remember, Pico uses the same XR2 chip the Quest uses, so it's doable.
  • Create a set of AR glasses, either through acquisition or other work that Pico has been doing that we do not know about.
  • Leverage TikTok's popularity to compete with Snap, and soon FB, in the western market for consumer AR.

So what is missing?

Right now, both Snap and perhaps ByteDance still have a content issue. Yes, AR filters are very popular, but that will not be enough to sustain interest. To succeed they need access to the consumer's phone, which offers FAR more content than any one walled garden can ever create. To do this they need to figure out how to take mobile 2D content and display and interact with it on the XR device. This is why I wrote about Samsung DeX for AR-lite in the past. But guess who else has a DeX-like experience? Huawei.

I'm calling this now. Watch for a future agreement between ByteDance and Huawei for a mobile-based desktop experience that works in both VR and AR.

Also watch for further moves by ByteDance to implement AR glasses. It's getting closer … perhaps 2023 is when we will see the first major plays into consumer AR.

Snap Glasses are ok, but $500M for WaveOptics is really interesting
Sat, 29 May 2021
https://www.perivision.net/wordpress/2021/05/snap-glasses-are-ok-but-500m-for-waveoptics-is-really-interesting/

Recently Snap dropped on us their new dev version of AR glasses. There was not much on specs, save 2000-nit (1000 per eye, I guess) waveguide displays. Since it's tethered, I would guess it's pretty much following most of the other AR glasses I have seen pop up based on the Qualcomm chip.
However, what was really interesting to me was the announcement from Snap the following day: a $500M purchase of WaveOptics. This was a signal to me.

Typically announcements like the Snap DK glasses, or Lenovo's, or even the Niantic announcement always feel like a soft foot in the door: basically trying to gauge interest while at the same time working with developers to create content and, more importantly, shaking out the bugs in the wearable as well as the ecosystem for an AR experience, be it a hardware, software, or ecosystem play.

M&A, however, always catches my attention. Yes, companies like Apple and Facebook are buying companies left and right; however, when companies that are not quite as awash in cash start spending, that tells me something.

I have been around for a while. I saw the first great transformation, and thus VR bubble, in the late nineties, and again back around 2016-ish. After both great leaps in XR capability, the same hype always seems to follow: XR will explode onto the market.

ABI Research Reports $2 Billion+ in Funding and M&A for AR/VR

Although I do not believe we will see growth like the prediction above, I do believe the market will grow, and around 2025 we will see a third revolution in XR systems. I'm not the only one who sees this; in fact, it's somewhat expected that 2025 is when we will really see AR glasses. Ecosystems are already being built and developers are being invited in. It's 2025 and after that I think we will see a change in consumer habits and substantial market opportunities.

So returning to why I think Snap's purchase of WaveOptics is so interesting. To compete in the new AR market, you need three things:

  • Experience delivery. In this case, lightweight, attractive glasses.
  • Content. Simply having a few cool apps is not going to do it. You need just as much content as we have on our phones today.
  • Users. It does not matter how good your system is if you do not have a user base ready to consume it.

The last two take quite a bit of investment unless you already have access to the phone (be it Apple, Google, a major platform player like Microsoft, or a major phone manufacturer). So hardware is the third part, where you can make a play to create content for your ecosystem and hopefully build a user base. Snap has already tried this with their first foray into camera-only glasses. It didn't work out all that well, but the concept is still sound. If you own the hardware for display and/or input, you have at least some control over the content created for AR consumption, be it wearable or, currently, phone based. Thus the investment in WaveOptics.

Snap has a very good customer base. Its apps are popular. So if they can also create a good set of glasses that enhance the particular content their user base already enjoys, they can find themselves well positioned to ensure their apps are part of any AR ecosystem in the future. And that is the key here: the future play. As we move into a new ecosystem that is part phone and part AR-glasses based, how do you ensure relevancy?

$500M is no small M&A. Snap is betting that they can get enough glasses out there to ensure content for their ecosystem is sustained. This current device is clearly a DK, but you do not drop $0.5B on a company unless you plan to build future devices.

In the near future, there will be an interesting battle of ecosystems: open and closed. Apple will create a closed system, as we can already guess. Niantic is creating a metaverse, as is Nvidia for engineers, as is Epic, which just raised $1B for this effort. However, if AR glasses can talk to phones the way Bluetooth devices can, then the question will be: how do these glasses consume content and access these various metaverses? If, for example, Google creates a protocol to allow any wearable AR device to access and display phone content, then apps that have already matured for the AR wearable market will have a substantial advantage; they can grow and capture new users with features, while apps that have not matured may find themselves displaced. Those apps, in turn, can decide which metaverse(s) to support.

This I can see being worth a $0.5B investment.

View 2D Android apps in VR with PhoneCast – S8 only
Fri, 15 Sep 2017
https://www.perivision.net/wordpress/2017/09/view-2d-android-apps-in-vr-with-phonecast-s8-only/

Here is something I have been waiting for: the ability to use my 2D Android applications and 2D games in VR. The way it works is a 2D window is provided in your VR environment. Within that window you can run Android apps! Sweet, right? This GearVR app is called PhoneCast, which is already out there. Currently the applications you can use have to be white-listed, and all of the current apps are video streaming services, something that does not really interest me. However, I got to test a new version that has a 'Labs' area where, instead of white-listing apps, certain apps are black-listed. That means there are certain apps that are not allowed to run, but the rest are up to the user. So of course, some apps may work, some may not. That is why it's called 'Labs', my friends. The everyday apps seem to work fine: email, Google Photos, PowerPoint. Typing with the keyboard was not fun, but this is to be expected. I will say looking at PowerPoint is way better with the large display than using my phone. As much as I like productivity apps, I really wanted to try a few games.

  • Quick note: looks like it's US phones only, running the Qualcomm chip, for now.

Some games seemed to have a glitch the first time I would try them. I tried Fallout Shelter first. It did not load the first time, but worked fine after that. Not sure what the story is here. Same with Pokemon Go: it did not work the first time, then worked fine. However, it locks to portrait, and when using AR, the camera does not display correctly. I turned AR off and it was fine after that. A few other games seemed locked to one view port or another, but that was not too much of an issue after I got used to it. I would make the window a bit smaller for portrait games, larger for landscape.

I then tried Monument Valley, a favorite of mine. This took three tries, but again, it seems to work after that. The game play was even nicer with the view port set to the largest setting, despite portrait mode. I have it smaller in my screenshot. It took a little bit of effort to get used to the controller, but I got it down soon enough. I do not like this screenshot that much, but for some reason, screen capture and video capture are iffy with PhoneCast. There is a video of this at the bottom of the post as well.

What was really cool was trying ARCore apps in PhoneCast. It 'mostly' works, and when I say mostly, that is because the viewport is not locked to the head. If you check out the video below you will see what I mean. Still, VERY cool! I tried the first ARCore app that was released, 'Atom Visualizer' from Signal Garden. I contacted the good people involved in this project and made a request; we'll see how it goes. If they can lock the viewport, this would be really fun! Have a look for the video at the bottom of this post.

Here is a pro tip for you: the controller takes a bit of time to get used to, so don't get put off. Keep working at it. BTW, there is no pinch-to-zoom, something I found out in Fallout Shelter. Hopefully they will find a workaround. I suggested they should map something on the controller for zooming, and zoom around the center of the view. That could take some work though.

So there you go!  Enjoy.

XR and the Continuum of Immersion in eXtended Reality
Sun, 05 Feb 2017
https://www.perivision.net/wordpress/2017/02/xr-and-the-continuum-of-immersion-in-extended-reality/

Back in 2016 I did a presentation on the Future of VR. One slide was titled: VR, AR, MR… XR? Basically wondering how to describe yet another way to experience VR where a virtual world is mapped to the real world, fully immersive, but with one or two real things popping through. Ugh. So I started to refer to the whole continuum of VR-AR-MR as xR. Seems this was not an original idea; lately I have been seeing XR used in the press to refer to Extended Reality, representing the set of the various acronyms we have been using. Great. But that is not solving the problem. We still have this alphabet soup of R's.

So what I have now started to do is think of something not as VR, AR, HUD, or whatever-R, but instead in terms of levels of immersion. By thinking about the application of XR, I think we get away from the traps of which XR we want to use. You have fully-immersive, as with VR; semi-immersive, like AR and MR; and non-immersive, like various AR and HUD systems. By thinking about it in this way, we do not worry about which of the XR's we are talking about, but instead ask: what is the immersive quality of the simulation, and how does that affect the user's experience?
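To make the idea concrete, here is a minimal sketch in Python. It is purely illustrative: the level names and the decision rule are my own assumptions, not anything from a standard. The point is that you pick the immersion level from the use case, not from an acronym.

```python
from enum import Enum

class Immersion(Enum):
    """Hypothetical labels for the immersion continuum described above."""
    NON_IMMERSIVE = 1    # HUDs, hand-held phone AR
    SEMI_IMMERSIVE = 2   # see-through AR / MR glasses
    FULLY_IMMERSIVE = 3  # enclosed VR HMDs

def pick_immersion(redirect_attention_fully: bool,
                   needs_real_world_context: bool) -> Immersion:
    # Pain aversion wants the patient's attention fully redirected;
    # a pilot's landing aid must keep the real world in view.
    if redirect_attention_fully:
        return Immersion.FULLY_IMMERSIVE
    if needs_real_world_context:
        return Immersion.SEMI_IMMERSIVE
    return Immersion.NON_IMMERSIVE
```

Only once a use case lands on a level like this would you go shopping for a VR, AR, or HUD system to deliver it.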

For example: pain aversion. It does not matter if it's AR or VR; what matters is whether the patient's attention is sufficiently redirected to reduce pain and anxiety. A pilot does not care if you call the advanced helmet display a HUD or AR, but instead whether they understand the information in the context of the landing and how the weather is affecting the aircraft's approach vector. Also, how does one interact with the content? Do we need deep haptic immersion, or is simply viewing with an HMD, or even holding a display like a phone, good enough?

When thinking about which of the various R's in XR we want to explore, we need to start with the use case. What is it we are trying to achieve? Far too often I hear people talk about using VR for this or that, and I have to keep asking: WHY? What are you gaining from going into a VR simulation when looking at that same content on a screen is actually better HCI? "Here is the idea; you can see Excel data as dots in VR!" 'Great, but I still need to read the numbers…' "Oh, you just gaze over dot by dot and you can get the numbers." Sigh. This was an actual conversation, BTW. If we were talking about thousands of values, perhaps, but it was any Excel sheet.

When exploring any XR technology, let's not be a hammer looking for a nail. Instead let's ask: how would immersion improve the experience? Then, when we have that answer, we should ask: what level of immersion is best suited to that task? THEN we can determine which of the many XR's we should employ.

VRHackathon this weekend, Dec 2-4 2016. Mixer on Dec 1. Prizes!!
Thu, 01 Dec 2016
https://www.perivision.net/wordpress/2016/12/vrhackathon-this-weekend-dec-2-4-2016-mixer-on-dec-1-prizes/

Here we go: another VR hackathon happening in SF (Microsoft Reactor, 680 Folsom St Ste. 145, San Francisco, CA 94107). I expect lots of Vives, Oculuses, and GearVRs. I should not have to explain to you, dear readers, at this point what a VR hackathon is. BTW, the site says it's sold out, but it's not; use the RSVP link and/or show up either Thursday at the mixer at Razer's SF office (201 3rd St #900, San Francisco, CA 94103) or Friday at the Reactor. Sign up for the VRHackathon here.

I will stop in at some point on Thursday, but I will be there on Friday and Sunday, when I will judge the best GearVRf project. Oh, what is GearVRf? In short, it's an open-source project that allows the creation of VR worlds using Java. Prizes? Of course! Five Gear 360 cameras will be given out.

Will there be devices we can use? I plan on bringing 10 GearVR Note 4 devices. Yeah, it's an older device, but it works fine for hacks.

The GearVR Framework Project

The Gear VR Framework (GearVRf) Project is an open-source collaboration built around the GearVRf rendering library for application development on VR-supported Android devices.

GearVRf makes it easy to develop compelling VR applications:

  • Easy-to-use Java application development environment
  • Easy access to powerful VR technology
  • Does not require extensive knowledge of OpenGL and the Oculus renderer
  • Simple and effective rendering manager
  • Compatible with mainstream VR development tools

 


 

The GearVRf API provides VR application developers with simplified access to Oculus SDK functionality, via the Java Native Interface and the GearVRf native library.

Now some details for the SF VRhackathon.

It's being held at Microsoft's Reactor space in San Francisco, an interactive and modern office located just minutes from the Moscone Center and Powell BART station.

Friday
18:00 VR Meetup (Demos and presentations)
19:30 Program kick off (Presentations)
20:00 Hackathon Kickoff/Lightning Rounds/Team Formations
22:00 Adjourn

Saturday
9:00 Registration / Breakfast
10:00 Hackathon begins & Morning announcements
13:00 Lunch
19:00 Dinner
22:00 Adjourn for the evening

Sunday
9:00 Breakfast
12:00 Lunch
13:00 Hacking Concludes / Team Presentations & Judging
16:00 Winner announcements
16:15 Hacker Expo
18:00 Doors Close

Controllers for mobile VR will become standard. Thanks Daydream!
Thu, 24 Nov 2016
https://www.perivision.net/wordpress/2016/11/controllers-for-mobile-vr-will-become-standard-thanks-daydream/

Google Cardboard, Samsung GearVR, and the numerous copycats have basically assumed the user will not have a controller. With the exception of some GearVR titles that require a gamepad, the basic UI of most VR apps relies on gaze control or the track-pad seen on the GearVR. And for a first try at VR, the track-pad was not the worst idea in the world. It was cheap to make, there is no set-up, nothing to charge, and you will not lose it. All good points. So good, in fact, that I really hope future GearVR headsets keep the pad. However, for immersion, UI/UX, and general VR happiness, you really need a controller… and a controller you can see in the VR world.

For those of you who have yet to try the Daydream, it's mostly a really nice Cardboard with a higher-spec phone for better rendering and head-tracking. (I talk about this in my previous post.) But what is really different, and IMO better, is the controller. It's not that the design of the controller is all that great; it's the fact that it's there! Having this sense of presence, both from the haptics of my hand as well as my head movement, greatly improves immersion and enjoyment. The presence of a controller may not seem like a big deal, but it is.

What is really great about the Google Daydream is the commitment to a controller. When you buy a headset, it comes with one. Thus app devs can always rely on the fact that a controller will be there. This certainty is what we are missing today. And Samsung and other headset makers can take a cue and make it a requirement for their systems as well. Unfortunately, as I understand it, the Daydream controller will NOT work on other systems, but look for others to fill this market.

Also coming is the mobile stand-alone headset. I do not expect to see anything beyond POC at CES 2017, but I fully expect that towards the end of 2017 we should see a few come out. These devices will have controllers as well, allowing devs to leverage previous work on the Daydream and GearVR systems in the new stand-alone systems.

Also watch for 6DOF controllers as well. This will be a bit harder to crack. 🙂

Daydream vs GearVR
Wed, 23 Nov 2016
https://www.perivision.net/wordpress/2016/11/daydream-vs-gearvr/

There have been a number of posts comparing the Google Daydream VR experience to the Samsung GearVR. I have been using both for a while now and think I can offer a pretty good evaluation.

When I first tried the Daydream I had the same response most people had. It was small, attractive, and felt great. I really like the tactile feel of it, better than the hard plastic of the GearVR. My only complaint was the field of view. However, now that I have spent a good deal of time in it, my opinion has changed somewhat. When I first tried it the strap was VERY tight; however, that kept light bleed to a minimum and I never really felt the weight. The next time I tried it, the tight strap started to really bother me, so I loosened it. That made it far more comfortable, but then light bleed was pretty substantial. The fit also felt a little less complete and instead felt a bit top-heavy. Once you get into the task you are doing, a game for example, the light bleed 'sometimes' goes away. What I mean here is that when I'm deep into doing something in the game, I do not notice, but at transitions or other points in the game where I'm not deeply focused, the light bleed comes back to my attention. Same with the fit. When I'm in the middle of something, I do not notice, but at transitions, I cannot help but feel like the headset is resting on my forehead and not comfortably across the face.

Now to the phone. The head tracking at the beginning feels just as good as the Gear's. I have to really pay attention to notice any lag. However, for whatever reason, the phone heats up very quickly. The hotter the phone gets, the slower the system becomes. Then you can feel the head tracking slow down just a bit, enough that if you are standing, you can start to get motion sick.

Now here is where the Daydream is great, and it's nothing to do with the headset or the phone. It's the hand controller. The hand controller is a simple 3DOF motion-tracked device, but what this does is greatly enhance immersion, because you can 'see' your hand movements in the environment and interact with objects more naturally. So far I have not seen an amazing implementation, but give it time. This simple controller works very well; yes, it drifts, but re-centering is not a big deal. I really wish the GearVR came with one. I fully expect someone at some time is going to write a driver to allow the Daydream controller to work on the GearVR if Samsung or Oculus does not make one themselves. As simple as this is, the knowledge that ALL Daydream devices will have this controller gives developers confidence to design their apps with this support.

An issue with the Daydream, which I also have with the Gear, is drift. As the phone heats up it tends to drift, typically to the left. Seems like this is just a limitation of the internal compass for now. I'm sure they will fix it in the future.
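Re-centering is why the drift stays tolerable: when the user holds the Home button, the current yaw is simply stored as the new "forward". Here is a minimal sketch of that idea in Python (illustrative only; real SDKs like Daydream's expose their own re-centering call, and this function name and its parameters are my own):

```python
def recenter(current_yaw_deg: float, reference_yaw_deg: float) -> float:
    """Map a raw controller/headset yaw into app space relative to the
    last re-center, wrapping the result into (-180, 180]."""
    delta = (current_yaw_deg - reference_yaw_deg) % 360.0
    if delta > 180.0:
        delta -= 360.0
    return delta
```

With `reference_yaw_deg` updated on each re-center, the compass can drift tens of degrees and the scene still faces the user after one button press.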

So my final assessment: I would buy the Daydream for a casual user, but stick with the GearVR for anyone else. A key point, however, is that the Daydream does not need to compete with the GearVR; it needs to be a friendly, accessible device that anyone can use. And for Google, this seems like exactly the market they are after, so despite my complaints, I think Google nailed this one spot on.

The BEST use of deep immersive VR for a promotion: wireless GearVR room positional tracking
Mon, 14 Dec 2015
https://www.perivision.net/wordpress/2015/12/the-best-use-of-deep-immersive-vr-for-a-promotion-wireless-gear-vr-room-positional-tracking/

I just saw a video that blew my mind. It's not JUST that it's a great idea, nor just that it looks like lots of fun, nor the REALLY great combination of Xbox tracking and GearVR to allow room-scale VR, but the APPROPRIATENESS of the marketing event! Get ready for some serious gushing over the best VR marketing experience ever! There are a few levels from which we can look at this video; let's go through some of them. (Video at the end.)

Technical

Technical

The obvious thing is what they pulled off technically. Using multiple Xbox sensors (I'm not sure how many), they are able to do full-body tracking. Because the user's vision is completely obscured by the HMD, they can place the Xbox sensors and light the room to provide maximum effectiveness for body tracking. Hand and head positions are most likely sent via closed Wi-Fi to the Gear. From there it's rendered in (Unity?). As you can see in some of the shots, not only is head positional tracking going on, you can see your hands on the rope as you walk across!
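A rough sketch of how that tracking-to-headset hop could work (this is my guess at the architecture, not anything confirmed in the video; the port number and JSON field names are invented): the tracking PC fuses the sensor data and broadcasts per-frame poses over the closed Wi-Fi network, and the headset app reads them each frame.

```python
import json
import socket

TRACKING_PORT = 9000  # assumed; any free UDP port would do

def send_pose(sock: socket.socket, addr, head, left_hand, right_hand) -> None:
    """Tracking-PC side: push one frame of pose data to the headset."""
    msg = json.dumps({"head": head, "l": left_hand, "r": right_hand})
    sock.sendto(msg.encode("utf-8"), addr)

def recv_pose(sock: socket.socket) -> dict:
    """Headset side: read one pose datagram and decode it."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data.decode("utf-8"))
```

UDP fits this job: a late pose frame is worthless, so it is better to drop it and render the next one than to wait on TCP retransmits.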

This is great. Now add wind, cold air, and sound (notice we saw no headphones, so the sound was most likely piped in), and you have a very deep immersive experience. Because the Gear VR is wireless, the user is not restricted by cables. In fact, once you are immersed, and given all the other sensory input, your brain can have a real fight deciding what is real and what is VR.

Even better is the fact that you put on the HMD outside the room; (again, as best I can tell) the user never saw the real bridge that was constructed, so the brain has no other cognitive interpretation of what is going on except for what they see. BTW, don't misunderstand me: with the exception of perhaps one person, they KNOW it's a simulation, and short of flinching and perhaps lowering one's body, they are enjoying it for what it is.

Marketing

I would guess most people who watch this video, especially those in the VR world, would mostly be thinking about how great an immersive simulation this is, and move on. But what really caught my attention, and what I like and admire just as much about this concept, is how well this works from a marketing point of view. Remember, they are releasing a (new?) instant latte. The selling point for this product is that you can enjoy a cafe-quality latte anytime, simply by using a microwave. OK, so what are the customer scenarios where they may want this? Most likely at home and perhaps the office. I think most offices offer coffee, and if not, typically there is a cafe near most large office complexes. But at home, not so much.

In the experience we saw in the video, you are asked to cross a long dangerous bridge on a distant exotic mountain top in a blizzard! Many dangers face you on your trek to get to the other side, where a microwave waits for you. Each step on this old decrepit bridge becomes more dangerous as the winds increase and rocks start to fall all around you! Somehow, through perseverance and a bit of luck, you make it to the end where you find a microwave (yes, people do randomly leave microwaves in the Himalayan Mountains, just go with it). You open the door and a wonderful hot beverage is waiting to warm you up, body and soul.

Now, think about a long snowy winter train ride home where you have to walk the last 3 or 4 blocks. Not too different, eh? And if you like lattes, then knowing you have a nice warm drink waiting for you could be that much better. So the selling point here is to associate a tough trek with a warm reward. This experience drives that idea home to the customer in a very real and tangible way. An experience they will not soon forget, and will share with their friends, including the idea of the microwave latte and the branding that goes with it.

Execution

I have not experienced this, but having been doing VR for so long, and having tried everything from the CAVE to the wind bird simulation, it's not hard for me to put myself there and have an idea of what this would feel like. I love the fact that this experience really allows you to 'walk' through it. Yes, we have plenty of simulations where you can walk, but typically it's flat ground. Here you have a bridge; you have rope to grab. As the wind starts to push you back, the IMU on the headset reflects the subtle changes in head position because of it. As you walk, the scene walks. With the Xbox sensors and the very well-lit room, body tracking can be very precise. I'm sure a dedicated local Wi-Fi network was running, allowing the fastest possible update time that can be done with current technology. Based on the headset graphics it looks like Unity is running this, but again, because all the various senses are being reinforced with the same cognitive interpretation (you are on a bridge, in a cold wind storm, with the proper sounds), the less-than-realistic graphics do not matter so much. For some people, it's too much!

BTW, this could have been staged, not sure.

A small note here. Notice no gloves on people's hands. They are asked to walk through this cold experience with no gloves. What that means, and you can tell at the end when people come out, is that they are using the drink to warm themselves. A GREAT bit of experience detail. Even after the HMD is off your head, YOU ARE STILL COLD! The experience is still with you, and the reward of the hot latte continues to reinforce the benefits of buying this product.

Appropriateness

So here I want to reinforce something about VR and marketing. Like I said, I have been around VR for a long time, 20-plus years. I have seen MANY marketing events that try to use a VR experience to create an association of 'high tech' or 'cutting edge' or 'next generation' with a product, some successful, some not so much. In all cases, however, the VR experience had little to no association with the value of the product. Further, once the VR experience was over, that was it. I typically remember what I liked or was annoyed by in the experience; to date I cannot remember a single product from one of those marketing campaigns. This is different. Even though I did not experience it myself, the whole idea of walking through a blizzard to be rewarded with a hot latte seems to stick. The more real the struggle before the reward, the more likely you will have a favorable memory of the reward. What is appropriate about using VR here is that VR has nothing to do with the campaign. It is simply the best tool to achieve the maximum positive reward association with the product. Not trying to be 'new tech' or 'cutting edge' or 'cool'. Well, OK, a bit cool. But instead simple marketing 101: create a positive association with your product, and VR, in this case, was a great choice.

Now, check out the Video.

Samsung Gear VR Internet is out. And here are 10 things that would make it even better
https://www.perivision.net/wordpress/2015/12/samsung-gear-vr-internet-is-out-and-here-are-10-things-that-would-make-it-even-better/
Wed, 02 Dec 2015

So VR Internet, called 'Samsung Internet' for some reason, just came out today.  It is a beta, so some things we are going to overlook, like YouTube not always loading and playing, and drift; instead let's focus on the things I think will make it a great 1.0.  Before we start, PLEASE SUPPORT .GIFV.  If you can't, then expose the URL bar so we can change .gifv to .gif.  I'm guessing this is a beta thing too.  OK, let's start the list of things to make Gear VR Internet great!
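That .gifv workaround is literally just trimming the trailing 'v' from the URL. A sketch of what the browser could do automatically; the helper name is hypothetical, and it assumes the host serves a .gif at the same path (Imgur generally does, but that is host behavior, not a guarantee):

```python
from urllib.parse import urlsplit, urlunsplit

def gifv_to_gif(url):
    """Rewrite a .gifv URL to its .gif equivalent, leaving other URLs alone."""
    parts = urlsplit(url)
    if parts.path.endswith(".gifv"):
        # Drop the trailing 'v' from the path, keeping scheme/host/query intact.
        parts = parts._replace(path=parts.path[:-1])
        return urlunsplit(parts)
    return url

print(gifv_to_gif("https://i.imgur.com/example.gifv"))
# -> https://i.imgur.com/example.gif
```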

1)  Increase the fonts!

Until we get 4K screens, we are going to have to deal with the screen-door effect, and our current 2K screens are not great for reading.  Increasing the font size will help, but remember to let us choose it.  And while we are at it, let us change the font itself.

2)  Now once we increase the fonts, we need a bigger CURVED screen.  

I'm glad I can zoom in, but I will want to zoom in even more, and the more we zoom in, the worse a flat screen looks.  Check out the Oculus VR desktop.  Learn from that.

3) Now let's talk UI.  Let me edit the URL bar.  PLEASE!

4) It does not make sense to go to ‘TABS’ to create a new window.

I understand we are creating a new 'tab', but it's not intuitive.  I should have that option at Home as well.

5) And speaking of Home, where did the recommended list go?  

Now it's filled with stuff I was testing out.  If I wanted to save it, I would have created a bookmark.  Please provide a way to get that list back if we choose.

6) I really like the Void theater mode when viewing Videos.  

I can haz?

7) Speaking of video, lets sideload into MilkVR

I like MilkVR; it's my preferred video viewer.  When selecting a video, provide an option to send that URL to MilkVR.  Even better, remember where we came from, and when we are done watching in MilkVR, send us back to VR interwebs.

8) Swipe keyboard input would be great here.

9) Something seems off in trying to find the <body> tag and zoom to it.

Reddit is my test case.  It zooms in way too much, and I have to back out each time. BTW, the title font on Reddit reads really well on the Gear.  Clue!

10)  And finally … WebGL!  And Web3D while we are at it.

11) Rename this to VR Interwebs.  No reason.

BTW, Streetview would be nice.

Now, I think I’m going to make a version of Newssnacker.com just for the Gear!  Keep your eyes open.

 

Review of the first major US live sports event in VR
https://www.perivision.net/wordpress/2015/10/review-of-the-first-major-us-live-sports-event-in-vr/
Wed, 28 Oct 2015

http://www.nbcbayarea.com/news/local/First-Time-NBA-to-Offer-Live-Virtual-Reality-Streaming-of-Game-Warriors-vs-Pelicans-337523961.html

So it finally happened.  We got to see a major game, live, in VR.  I wish I could say it was great (it wasn't), but it was good enough to MORE than prove that live, or even post-broadcast, games in VR have amazing potential, and I am waiting with money in hand.

First, what went wrong.  Only three things, really.  The stream was not strong enough to keep up with the action.  When the players were mostly standing around, it felt like the feed could hold about 25 fps.  However, once the movement started to increase, the compression could not keep up, and soon I was watching 15 and even 10 fps.  That is, when it did not lock up on me entirely.  (To fix that, I simply restarted the app.)
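Some rough back-of-envelope arithmetic shows why a bandwidth-capped stream trades frame rate for motion: fast action needs more bits per frame, so at a fixed bitrate, fewer frames fit per second. The resolution and bits-per-pixel figures below are illustrative assumptions, not the actual broadcast's numbers:

```python
def stream_mbps(width, height, fps, bits_per_pixel=0.1):
    """Rough bitrate estimate for a compressed video stream, in Mbit/s.

    bits_per_pixel is an assumed average after H.264-style compression;
    busy, fast-moving frames need more bits per pixel, which is why the
    frame rate sags when the encoder is capped at a fixed bitrate.
    """
    return width * height * fps * bits_per_pixel / 1e6

# Illustrative: a 4K-wide panoramic frame at the frame rates I saw.
for fps in (25, 15, 10):
    print(fps, "fps ->", round(stream_mbps(3840, 1920, fps), 1), "Mbit/s")
```

Even with these generous compression assumptions, a smooth 25 fps panorama wants several times the bandwidth of the 10 fps slideshow the action sequences degraded to, which matches what the headset showed.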

Second, WHY in the world are we sitting so damn low?  And 'thankfully' someone told the camera guy to move after the half.   Really, it's a camera.  Get me court-side and 8 feet up.  I also understand the issues with trying to cover a large court with one camera, but you can zoom in a little bit more; I don't mind moving my head a bit more.  The players looked so small that, combined with the poor frame rate, I thought I was watching a video game.

Finally, announcers and scores.  There should be an easy way to get that audio feed synced, plus a simple 2D overlay for the score.

Now the good parts.  For all its limitations, it was a FANTASTIC taste of what is to come.

It was nice to look where I wanted to look and really feel the pace of the game.  I did not feel 'there', but I can now imagine what it will feel like.  In other words, even though the NBA tip-off was not an 'immersive' experience, it was close enough that I know what it CAN feel like, and I want it even more!!

The other key takeaway is that the issues I listed above are not that hard to fix. Get 4K screens, partner with a company with a better compression algorithm, get the camera off the floor, add some voice and data overlays, and we are at v1 for live VR sports.

BTW, phone overheating was still an issue.  It was easy enough to fix by simply turning on a fan and putting on a sweatshirt, but the demands of a 4K stream will push the hardware even harder, so for now keep a fan nearby.  If you do not have that option, turn off everything but the wifi and bring the brightness down a bit.  It will help a lot.

And why did we not have the option to watch this on the Oculus?  Yes, it's WAY better to have a mobile device so I could watch it where I wanted to, but if I could get better performance out of the Oculus or the Vive, why not offer that?

All in all, good job on the first test.  I know it will get better and better.

 

 

 
