The Sparrow is beautiful, and I agree that the things we use in our lives should be as beautiful as possible. But at what expense? No employee of a store using the Sparrow can be blind. The Apple-Smoothie interface principle seems to treat touch as necessary only to hold, to behold, a product.
Even sighted people need tactile cues. Notice your keyboard: it likely has bumps on two of the keys. My Natural keyboard has bumps on the “F” and “J” keys. Why? Did the manufacturing equipment just hiccup? Of course not. They’re cues I subconsciously use to find the keys.
The Sparrow is just the most recent example of how vision-dependent the future of tech is in designers’ minds. That’s not too surprising, since design has traditionally involved vision. But that’s too limited, too easy. We need to expand design to consider all the senses.
I’ve been somewhat unfair about the iPhone’s accessibility. Apple has considered the issue. Like the rest of us, though, Apple treats accessibility as something applied on top of the “real” product. Certainly how to provide the UI enhancements has to be considered in early planning, but accessibility isn’t an integral, important part of the design.
Why not? I mean, as long as we’re dreaming.
What would a system be like that could recognize speech…when it’s signed?