"True" 360° Views integration into the walkthrough, by editing in Workshop

Instead of using Mattertag links to jump to and from 360° Views, or other "workaround" solutions, I propose here something more radical: a "true" integration of 360° Views into the tour.

The default would remain the "automatic" process. BUT, if we MSPs want to put in some work for a better end-user experience, Workshop could allow it. Here's how:

Treat a 360° View as an area with an artificial "floor" and walls: a half-sphere, a rectangular box, whatever. It doesn't matter that it isn't the actual shape, but the MSP can choose the "size" of the floor. On that floor (and, where useful, the walls), let the MSP manually add "artificial scan points", each of which has to be "connected", both ways, to either:

- a "true" scan point (i.e. it leads to a scan point inside the "normal" 3D model), or
- an artificial scan point of another 360° View (i.e. it leads to that other 360° View)

This builds the following experience for the end user: when a visitor is inside a 360° View, she can look around as usual, but she also sees circles on the floor (like the spots in the walkthrough), i.e. the artificial scan points we inserted manually.

Clicking a circle takes her to the true scan point, or to the other 360° View.
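A minimal sketch of the data structure this proposal implies: artificial scan points that lead either to a real scan point or to a point in another 360° View, linked both ways. All names here (`ArtificialScanPoint`, `link`, the IDs) are hypothetical illustrations, not part of any Matterport API:

```python
from dataclasses import dataclass

@dataclass
class ArtificialScanPoint:
    """A manually placed circle on the artificial floor of a 360° View."""
    point_id: str
    position: tuple  # (x, y) on the artificial floor sized by the MSP
    # Destination: either the ID of a real scan point in the 3D model,
    # or an ArtificialScanPoint inside another 360° View.
    target: "ArtificialScanPoint | str | None" = None

def link(a, b):
    """Connect two artificial points 'both ways', as the proposal requires."""
    a.target = b
    b.target = a

# Example: a point in the garden 360 leads back to a (hypothetical) real
# scan point "scan-12" in the walkthrough; another pair of points connects
# the garden 360 and the pool 360 bidirectionally.
garden_to_model = ArtificialScanPoint("garden-1", (0.0, 2.0), target="scan-12")
garden_to_pool = ArtificialScanPoint("garden-2", (3.0, 1.0))
pool_to_garden = ArtificialScanPoint("pool-1", (-1.0, 0.5))
link(garden_to_pool, pool_to_garden)
```

This is just the bookkeeping the MSP's manual work would produce; the viewer would then render each point's position as a clickable circle and follow `target` on click.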

The MSP knows the actual space, so they can build the actual experience.

In the same way, let the MSP add Mattertags inside 360° Views.

That would let us leverage 360° Views far better, providing the seamless experience that a great virtual tour requires.

It's just a matter of letting the MSP do some manual work: define the size of the artificial floor for each 360° View, create the artificial scan points, and connect each of them where it needs to be connected. That's a small effort compared to how much the end users' experience improves.

If the MSP does nothing, the default is the "automatic" mode (Highlight Reel): at the moment the algorithm has no way to know how to integrate the 360° Views into the walkthrough, because they're just "detached" 360° pictures, so it throws them into the Highlight Reel.

But by manually "telling it", it's entirely possible to show them the right way.



  • Now this would be a huge benefit.

  • Interesting! Dropping 360 'pins' attached not to a mesh but to a point on the image. It would be great if a 360 were 'independent' / 'stand-alone' from a 3D model.

    It would be great to have these with Mattertag links. Imagine you are in a resort driveway. A 360 lets you see all around, with three (or more) 360 pins: one for the hotel lobby model, one for the entrance to a restaurant model, one for the garden-view 360s.

  • How about this?

    First, being able to manually place or drop the 360 view onto the model while scanning, placing it on the 2D floor plan on the iPad.

    Second, being able to draw a line between a "real" scan point in the model and the 360 view drop pin.

    Third, you would be able to expand the sides of the line to create a rectangle. The model would use the "real" scan point to set the floor base height and the ceiling height, then use the width of the drawn rectangle to determine the width of the "hallway" out to the 360 scan point.

    So when you see it on the dollhouse view it just looks like a hallway leading out to the backyard or something.

    Basically you would be building part of the model by hand. It would be similar to using the trim feature, but building a tunnel to the next scan based on the last scan's information.
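    The "expand the line into a rectangle" step above is a small piece of 2D geometry: given the real scan point, the 360 drop pin, and a chosen width, compute the four floor corners of the hallway. This is only a sketch of the idea; `hallway_corners` and its inputs are hypothetical, not anything in Matterport's tools:

```python
import math

def hallway_corners(scan_xy, pin_xy, width):
    """Floor corners of a rectangular 'hallway' from a real scan point
    to a 360 drop pin. Heights would come from the real scan point's
    floor/ceiling, as the comment above suggests."""
    (x1, y1), (x2, y2) = scan_xy, pin_xy
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    # Unit normal, perpendicular to the scan->pin line.
    nx, ny = -dy / length, dx / length
    h = width / 2.0
    return [
        (x1 + nx * h, y1 + ny * h),  # scan side, left
        (x1 - nx * h, y1 - ny * h),  # scan side, right
        (x2 - nx * h, y2 - ny * h),  # pin side, right
        (x2 + nx * h, y2 + ny * h),  # pin side, left
    ]

# A 2 m wide hallway running 4 m along the x-axis:
corners = hallway_corners((0.0, 0.0), (4.0, 0.0), 2.0)
```

    Extruding that rectangle between the scan point's floor and ceiling heights would give the "tunnel" visible in dollhouse view.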

  • Being able to walk from point to point on outside shots, so that you can connect a guest house with the main house, would be phenomenal... Very surprised that Matterport didn't think of a way to tell the app a direction to follow through 360-degree images, so that guest houses can be connected to main houses, or simply so you can navigate outside and back into the house without having to use the play feature. Matterport, you seem a little out of touch with what users want to see...

  • Usually when something is lacking, even something that seems "obvious" at first thought, it's actually not a matter of being out of touch with what users want to see.

    It's more that almost anything one can imagine is "feasible" (well, most of the time; sometimes it's not, for reasons that can't always be easily explained), but it involves reasoning, decisions, development time, tests, and investment (time alone is an investment): nothing is really easy.

    So when a new feature comes out and looks like it's missing something, it's often better to ask ourselves: is this better than nothing? (The alternative is to release only after reaching that better state.) I prefer something that will evolve over time based on feedback, while staying confident that the evolution will happen on many levels.

