Mac owners, this is for you. Practical Mac explores Apple's new software offerings, hardware upgrades and more. Appears every other Saturday.
What’s a Mac or iOS technology that you initially got excited about and then rarely used?
A few — ones I’ve surely trumpeted in this space — come to mind. When the Apple Watch was announced, the “digital touch” features seemed really cool. You could virtually tap or transmit your heartbeat to another Apple Watch owner using the devices’ haptic feedback motor, or send a quick hand-drawn doodle.
Turns out I used it only when I was showing the feature to someone who wanted to see what the watch could do.
Or what about Split View in OS X El Capitan and later versions, which displays two apps side by side on one screen, reducing visual clutter? I honestly forgot it existed on the Mac, even though I use the same feature fairly often on my iPad Pro.
I was thinking about this question recently as I pondered the AR (augmented reality) features built into iOS. It’s truly amazing technology: Using the camera on an iPhone or iPad, the system determines where a flat surface is, such as a floor or tabletop. It then understands that space in relation to the device, using its accelerometer and other sensors, enabling it to place virtual objects within real space, in real time.
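For the curious, that surface detection is exposed to developers through Apple's ARKit framework. The following is a minimal sketch of how an app asks ARKit to find a horizontal plane and then anchors a virtual object to it; the class name and box geometry are illustrative, and the code needs a real iOS device with a camera to run.

```swift
import ARKit
import SceneKit

// Minimal sketch: a delegate that reacts when ARKit detects a flat
// horizontal surface (a floor or tabletop) using the camera and
// motion sensors. Requires an actual iOS device; won't run headless.
class PlaneDetector: NSObject, ARSCNViewDelegate {
    func start(on view: ARSCNView) {
        view.delegate = self
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]  // ARKit 1.5 adds .vertical
        view.session.run(config)
    }

    // Called once ARKit has found a surface and anchored it in space.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // Anything added as a child of `node` appears to sit on the
        // real surface and stays put as the device moves around it.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(plane.center.x, 0.05, plane.center.z)
        node.addChildNode(box)
    }
}
```

In practice an app like Ikea Place does essentially this, substituting a furniture model for the box.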
Some retailers have jumped onto AR. Want to see what a specific chair looks like in an empty corner of your apartment? The Ikea Place app and others can visualize it for you.
The New York Times created an Olympics feature where, if you’re reading the article on an iPhone or iPad, you can watch a succession of U.S. Olympic athletes hover in your living room, then walk around them to learn details about their sport or their skills. My daughter happily slid along the floor with an iPad, looking up so that Nathan Chen’s ice skates were a few virtual inches above her head (because he lifts himself 20 inches off the ice when he jumps).
And there are games, of course. Several friends of mine are big fans of Pokémon Go, and the latest version does AR well, placing virtual creatures in one’s real environment more convincingly than the initial versions did. You can make dinosaurs charge around your neighborhood.
AR makes for amazing demos, but after the initial rush of trying it out, what sticks? Are people using it?
Perhaps this is just my own bias — maybe I’m just supremely lazy — but often when I’m playing a game on my iPhone or iPad, I’m content to stay in one place, usually sitting and relaxing.
The game “The Machines” is genuinely impressive: You choose a flat surface, like a tabletop, and it renders a battlefield where you direct teams of robots to capture your opponent’s base. On your device’s screen, it looks as if the battlefield is actually set up on the table, and you can walk around to get different views, peer over obstructions, and crouch in close to see what’s happening in detail. It’s more fun when you have another person as the opponent, because they share the same battlefield on their own device, both of you circling the area and directing your forces.
But that also means I’m moving all over my living room. I need to clear off what’s on the table and make sure there’s space for me to maneuver. On any given day, there’s far less friction in playing something that also allows me to hang out in one place.
So I’m having to remind myself that these apps exist on my already cluttered iPhone.
And yet, the AR technology Apple has built into the heart of its mobile operating system isn’t just a passing fancy or a way to entice customers with whizzy demos. The latest public betas of the upcoming iOS 11.3 release include ARKit 1.5, the framework developers use to implement AR in their apps. This version recognizes vertical surfaces (like walls) and maps non-rectangular surfaces (like round tables). That takes serious and expensive engineering work.
The thing about AR and other technologies Apple throws in is that they’re often scaffolding for other features down the line. The sensors and recognition technology used by AR are shared by Face ID on the iPhone X. And AR, it seems, is leading to some sort of heads-up display. Holding an iPhone or iPad in front of you for long stretches of time while moving around is just too awkward, but having it projected in your field of vision, hands-free, would be workable. However, an effective pair of smart glasses, or some other solution, doesn’t seem likely to reach the mainstream for several years.
I’d love to be proven wrong.