Life in Metavision, or how I mainlined a William Gibson novel.
First of all, let's look at the word Image: it breaks down to I MAGE, as in "I, the magician, create a reality for you to perceive."
The words magic and magician seem the most fitting to describe my Meta 2 augmented reality demo, but I may be a bit neuromantic. In 2013 I was blown away by my first experience with the Oculus Rift.
In 2016, I say we need to be truly aware that AR is by definition an entirely different and, to a degree, a far more transportive experience. I was ready for VR when I first used an Oculus Rift, because it placed me inside a more immersive form of media I had already experienced, such as gaming and movies. Meta 2 made me feel more as if I were the computer itself, and as if my experience were just part of a coming mesh of the organic and the cyber.
Don't get me wrong, I love VR, and VR is the now, but what Meta is working on is like mainlining a William Gibson novel. After my Meta 2 demo I felt strange: not nauseated, but eye-opened and almost nervous, as if the future of Neuromancer were tapping me on the shoulder to let me know that tech is moving faster than the current rate of my consciousness. I need to be faster, smarter, and more responsive to accept it. I actually took a quick break after my demo, and even then I needed a VR experience just to bring me back down to current reality.
The original Meta AR headset was a great proof of concept, with a 40° field of view much like Microsoft's HoloLens today. That was two years ago. The feedback the Meta team received was that it was good for R&D, good for innovation, good as a glimpse into the future of computing, but not good enough to replace the screens that characterize our lives in technology.
The team's answer was to tap some of the most innovative AR minds globally to develop an optical engine that can create a 90° field of view. The Meta team also drew on the Google Project Tango framework, acquiring the expertise of Alan Betran to develop a sensor array capable of dexterous hand interaction. With over 600 patent claims on the technology, the Meta team now offers us a truer vision of its future of reality.
What is the Meta vision for the future of computing? To shift the paradigm of computing from a small-scale, flat experience to a visceral reality of interaction and point of view.
My Meta 2 experience was curated by the ultra-charismatic Ryan Pamplin, who, based on his "joke" that he is wearing Meta 9 contact lenses, is either a robot or from a future where Minority Report is the reality. Legit, it was the same experience as the GIF below, but with a headset. Meta 9 will most likely blow that away. This current state of developmental AR can't be done justice in consumer terms, which is why I wrote this post from a cyberpunk perspective. I would say that if VR is in the Palm Pilot era, then AR is only in the dot-matrix-printer era. Imagine controlling video art on a REAL mountain with your eyes. Now that's the future.
Actually, Ryan Pamplin says he is not a robot, but he is a successful member of the Silicon Valley tech community and was an actual user of the first Meta product. He may be the person most excited about Meta that I know. Well, he works there, you say? Ryan actually left his position at his previous company to join Meta because he is so passionate about their vision.
You can check out my full demo of the Meta 2 here on my MettaVR video page:
http://www.mettavr.com/v/ijr38s2yabZ6tKTW8
Watch the video and imagine Minority Report; it really is like that. I just realized I am a minority reporting about tech that reminds me of Minority Report, which is called Meta, while I am posting videos on a site called Metta. Perhaps I should really look into putting together a dev team for this. Someone get me $949 so I can get a dev unit. Check out their page on how to get a Meta 2 unit.
One last thing I forgot to mention: it's also Dogpattable. And I'm still not sure if Ryan Pamplin is a time traveler or a robot.