I just attended the first US edition of the Collision conference in New Orleans.
A three-day marathon of startup pitches, product demos, and talks by founders, developers, media execs, and investors.
I focused my time mostly on keynote sessions to get a good feel for tech and media trends.
Here is what I captured.
I am not going to write about the strategic value of data, the astronomical rise of video, the push-and-pull game of audience fragmentation, the importance of native content, or the need to experiment with bots, etc.
All these topics were widely discussed, but I’d rather share fresh insights that I hadn’t heard much about before.
There are two of them.
1- VR is better with AR and AI:
VR was at the center of many, many conversations. Every company, every startup seems to be involved in some way or another.
It is like we are recreating the invention of the moving image, but instead of it taking over 100 years, it is happening in 3 to 5 years.
The impact of VR on our lives, our social interactions, our empathy is (will be) huge.
From the New York Times’ Displaced documentary covering the refugee crisis, to charity: water’s storytelling, to Cirque du Soleil working on integrating VR into their shows, to recording important moments of our lives with the new Samsung VR camera being released next month…
VR is the next big thing. That is pretty obvious.
What was interesting at Collision conference was to hear about what will happen when VR is here, at scale.
VR will be bigger with AR and AI (and vice versa)
Merging VR, AR, and AI deepens utility and “enterprisification” in the workplace, in education, and in general life experiences…
For instance, imagine a VR experience with voice recognition, spatial tracking, hand tracking, and personal data all in one.
You are in a VR experience: turn your hand up and your emails appear; speak to dictate your response; swipe right and it calls the person you want to communicate with…
As screen resolution improves, processing power speeds up, and content creation becomes simpler, new human behaviors that we cannot yet imagine will emerge simply because the VR ecosystem exists…
This is a good transition to the next insight.
2- New tech raises new ethical questions:
If we are going to create robots and attempt to enhance human genetics and physiological functions, we need to answer ethical questions that never needed to be asked before.
I started thinking about this at SXSW, where I saw a few engineers give presentations that were closer to philosophy than to technology.
Right now we know two states of being: “alive” and “not alive.” So where do robots fit in, and how do we define their missions in society?
What about the fascinating applications Halo Neuroscience will have, not only on athletic performance but also on learning, education, and medicine? How can the device be used without aggravating inequalities?
Is it pushing the limits of the human condition?