The other day, a colleague and I were debating the most influential and disruptive technologies of this century. The usual suspects came and went quickly and with little argument. TiVo … duh. Touch screens … actually invented in 1965, but they only became a staple of consumer devices in this century, so maybe. iPhone … revolutionary, yes, but derivative of the many personal digital assistants that came before it. Still, it laid the foundation for the next generation of telephony and basically killed the non-smartphone market. Ask Motorola, Ericsson and Nokia whether they think it was disruptive.
If the smartphone killed the mobile phone, then what will kill the smartphone? What is the next round of disruption for this industry? Why, exactly, is the smartphone dead?
To anticipate disruption, the first step is to identify the pain points in the existing process. For telephony, that's actually pretty easy.
- Smartphones need big screens. Practically, there is only so far a screen can grow and still be portable. Current phones already range past 5 inches, but no one thinks a 7-inch screen is practical for daily communication. Meanwhile, the gap between tablet, smartphone and computer keeps shrinking. We must find a way to do more with less space, or fundamentally change the equation.
- Touch screens aren’t that practical. Fingerprints drive us all crazy. Screen protectors are painful and steal the vibrancy from today’s high-res screens. Positioning a cursor on a smartphone is an exercise in futility, and don’t get me started on why iOS won’t add arrow keys to its keyboard. Touch screens also require you to look down, or at them, to engage. How many times have you almost walked into a pole while texting or surfing on your phone?
- Does anyone like earpieces? Both Bluetooth and wired models have fundamental limitations: tangled cords, poor sound transmission, weak amplification, and noise cancellation that baffles both the user and the party on the other end. Do you walk around all day with an earpiece on, or keep it in your pocket until needed? It generally needs a separate charger from your phone, and how often does your phone or earpiece run out of power just when you need it most? The underlying problem is that flat, rectangular phones are a flawed shape for telephony: great for data, poor for conducting a conversation.
- Where do you put your phone when you aren’t using it? Nomophobia is the fear of losing your phone. We can all agree it is a well-founded fear; most of us have misplaced, lost or dropped a phone, and with phone prices rising, it’s easy to understand how this became a condition. As phones get bigger, and need more power to drive their larger, higher-resolution screens, it gets harder to walk around comfortably with one in your pocket. Form and function are again at odds.
So what does my crystal ball say about the future of personal computing, communication and telephony? It takes a page directly out of Apple’s own playbook: identify a form factor that is stylish and trendsetting, and leverage existing capabilities to deliver a differentiated experience. Take technologies that are already here and combine them in a way that hasn’t been thought of yet.
Welcome to iGlasses 2018. Not the current interpretation of what Google’s Project Glass and Apple’s existing iGlasses initiative represent, but the real-world, disruptive application of that vision. Imagine a pair of eyeglasses, with thousands of frames to choose from, that integrates hard-wired earbuds (think a better version of what Oakley makes) plus battery and charging systems into a completely self-contained device. Use induction to charge the whole thing, and slim connectors, Bluetooth or wireless to move data on and off.
Like the existing Google Project Glass and Apple iGlasses initiatives, these transparent head-mounted displays would replace the traditional lenses in the frames, letting the wearer view and interact with content on a much, much larger canvas. Imagine interacting with a 50-inch screen floating in front of you. Don’t worry about walking into a pole: you can see right through the data, and your head is up the whole time.
Now, how would you interact with the data? On a smartphone you touch, tap and drag. With iGlasses, you could use four different input methods.
- Voice. Siri and voice technology are clearly improving. It won’t be long before you can guide your whole experience by voice alone. Siri, pull up today’s calendar. Siri, call my brother. That works already; on its own, however, it would become cumbersome for repetitive navigation commands.
- Eye tracking. The frames could carry sensors that track the movement of your eyes to determine where on the virtual screen you are looking. There are already dozens of practical applications using eye tracking as a computer interface; it simply needs to be miniaturized to fit this one.
- Hand/finger tracking. Kinect and other gesture-control technologies are exploding right now. This time, picture the sensor on the outward-facing side of the rim, tracking your hand and finger movements to position the cursor on the virtual screen. Look anywhere and type in the air to compose your next email.
- Lastly, and certainly not as far off as you think, is mind control. No, really! Check out this TED video to see how a small number of strategically positioned electrodes can let anyone move cursors and 3D renderings with their mind. Imagine that the glasses’ rims, temples and earpieces hold sensors that detect your unique brainwave patterns. Instead of reading how to click and swipe in the instruction manual, it’ll teach you how to calibrate the sensors to read your thoughts.
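To make the eye-tracking idea concrete, here is a minimal sketch of how gaze input might drive a virtual cursor: map a normalized gaze point onto a virtual display, and treat a steady gaze as a "click" (a standard dwell-time technique). Everything here is hypothetical; the names, resolution and thresholds are invented for illustration, and no real eye-tracking hardware or API is assumed.

```python
# Hypothetical sketch: frame-mounted eye tracker driving a cursor on a
# virtual display. All names and parameters are invented for illustration.

VIRTUAL_WIDTH = 1920   # assumed virtual-screen resolution, in pixels
VIRTUAL_HEIGHT = 1080
DWELL_FRAMES = 30      # about one second at 30 Hz: hold your gaze to "click"

def gaze_to_cursor(gx, gy):
    """Map a normalized gaze point (0.0-1.0 per axis) to pixel coordinates."""
    x = int(min(max(gx, 0.0), 1.0) * (VIRTUAL_WIDTH - 1))
    y = int(min(max(gy, 0.0), 1.0) * (VIRTUAL_HEIGHT - 1))
    return x, y

class DwellClicker:
    """Report a click when the cursor lingers near one spot long enough."""

    def __init__(self, radius=40):
        self.radius = radius  # how far the gaze may wander and still "dwell"
        self.anchor = None
        self.count = 0

    def update(self, x, y):
        """Feed one cursor sample; return True exactly when a click fires."""
        if (self.anchor is None
                or abs(x - self.anchor[0]) > self.radius
                or abs(y - self.anchor[1]) > self.radius):
            self.anchor = (x, y)  # gaze moved: restart the dwell timer
            self.count = 0
            return False
        self.count += 1
        if self.count >= DWELL_FRAMES:
            self.count = 0        # fire once, then require a fresh dwell
            return True
        return False
```

In practice you would feed `update()` a stream of samples from the frame sensors; staring at a button for about a second would select it, while normal scanning around the screen never triggers anything.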
Am I serious? Absolutely. Is it disruptive? You tell me. While you think about it, I’m placing my advance order.