As the pandemic recedes across the country, a lot has changed about how we work and live. Some of us are now planning on returning to the office, while others will continue working from home. Many of us will do both.
The software on our phones, our most essential tools in the pandemic, is also evolving for this new reality. The changes are courtesy of Apple and Google, which recently unveiled their latest phone software designed for this Covid-accelerated era of hybrid remote work.
This week, Apple showed off iOS 15, its next operating system for iPhones. The software gives device owners new tools to draw boundaries for hybrid work, like a status message that lets others know you are busy before they message you. FaceTime, Apple’s videoconferencing software, will also get its most significant expansion since its debut more than a decade ago. The service will finally open to non-Apple devices, including Android phones, and video sessions will look and sound better.
Last month, Google introduced Android 12, its latest operating system for mobile devices. The company focused on streamlining the design of its software to help people get their work done more efficiently, including a menu of shortcuts to get to their favorite tools more quickly.
To Carolina Milanesi, a consumer technology analyst for Creative Strategies, the changes in iOS 15 and Android 12 — primarily Apple’s tool for setting digital boundaries — reflect how our lives and work are shifting again as we try to leave the pandemic behind.
“That little prompt that says people don’t want to be disturbed is going to be really important,” she said. “Maybe you’ll prepare your message on the weekend, but you don’t hit send until Monday morning, so that way you don’t push other people to work on a weekend. It takes the pressure away.”
Apple and Google want you to be more productive.
Stuck at home for most of the last year, many of us became glued to our phone screens, chatting on video calls and messaging loved ones amid doomscrolling sessions. For some, that made it feel impossible to stay focused on a single task.
In response, Apple’s new software provides iPhone owners with tools to minimize distractions, while Google has added buttons to help people perform tasks on the phone more quickly.
Consider one new feature from Apple, called Focus. Focus can be used to set dividing lines for different parts of your day. You can break up your day into categories like personal life, work, and sleep. You then decide which people and apps can cause notifications to appear on your phone for each of those.
You can set the phone during work so that only your boss’s messages appear as notifications; other notifications will be muted. If you want to stay focused on your personal life, you could allow notifications only from family and friends. You can also set the phone to Do Not Disturb and have a status message like “On deadline” or “At the movies.” People trying to message you will see that status and might consider messaging you later.
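In essence, a mode like Focus amounts to a per-context allow-list: each mode names who may break through, and everything else is muted. A minimal, hypothetical sketch of that idea (the mode names and addresses are invented for illustration, not Apple's actual implementation):

```python
# Hypothetical allow-lists for each focus mode: only listed senders
# may trigger notifications; all others are silently muted.
FOCUS_MODES = {
    "work": {"boss@example.com"},
    "personal": {"mom@example.com", "friend@example.com"},
    "sleep": set(),  # nothing gets through
}

def should_notify(mode: str, sender: str) -> bool:
    """Return True if a message from `sender` surfaces while in `mode`."""
    return sender in FOCUS_MODES.get(mode, set())

print(should_notify("work", "boss@example.com"))    # True
print(should_notify("work", "friend@example.com"))  # False
```

The same lookup covers the Do Not Disturb case: an empty allow-list mutes everyone.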
Google’s changes are more cosmetic. It redesigned the controls in Android to include large rectangular buttons for easy access to functions such as the flashlight, internet settings, and the audio recorder.
Apple’s FaceTime is getting a Zoom-like upgrade.
Videoconferencing has become a ubiquitous communication medium for office meetings, happy hours, and yoga sessions. Now Apple’s FaceTime is getting a major revamp, with more features and the ability to work with non-Apple devices. Those changes put it more on a par with Zoom, the No. 1 videoconferencing app.
For the first time with iOS 15, FaceTime will be accessible through an internet browser. That means Android and Windows users can use their browsers to video chat with iPhone users in a FaceTime session by clicking on a link.
But the most remarkable new features for FaceTime remain exclusive to Apple users. SharePlay will allow iPhone owners on a FaceTime call to use an app together. If you stream a movie and hit the SharePlay button, the other person on the call will stream the movie simultaneously. If an Android user is on the FaceTime call, though, the SharePlay function won’t work at all.
Apple’s camera software plays catch-up with Google.
For years, Apple and Google have competed head to head in offering phone cameras that produce excellent photos. But Apple’s camera software has lagged behind Google’s, which is powered by artificial intelligence. In iOS 15, Apple is taking steps to make its camera software smarter.
With a feature that Apple calls Live Text, iPhone users can soon do more with the camera’s ability to function as a document scanner. If you snap a picture of a restaurant receipt, for instance, you will be able to use Live Text to tap on the phone number in the photo to call the restaurant. Or, if you point the camera at a tracking label, you can tap on the tracking number to immediately track the package — no typing required.
Apple device owners will also be able to pull up these types of images later with a keyword search. So if you took a photo of a handwritten recipe, you could open your photo album and type the recipe’s name to find the image. At that point, you could also convert the scribbled notes into text and transfer them to a digital notepad.
The technology powering this is known as optical character recognition, blended with some artificial intelligence. Google’s photo software has had a similar feature, Lens, for about four years.
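The "tap a number in a photo" trick boils down to two steps: character recognition turns the image into plain text, then pattern matching flags actionable strings like phone numbers or tracking numbers. A simplified sketch of the second step, using invented receipt text and deliberately crude patterns (Apple's and Google's real detectors are far more sophisticated):

```python
import re

# Stand-in for text that OCR has already extracted from a photo.
OCR_TEXT = """Thanks for dining with us!
Call (555) 867-5309 to reserve.
Tracking: 9400111899223100001234"""

# Crude illustrative patterns: a (xxx) xxx-xxxx phone number and a
# 20- to 22-digit tracking-style number.
PHONE = re.compile(r"\(\d{3}\)\s?\d{3}-\d{4}")
TRACKING = re.compile(r"\b\d{20,22}\b")

def find_actions(text: str) -> list[tuple[str, str]]:
    """Scan OCR output and return (action, value) pairs to offer the user."""
    actions = [("call", m.group()) for m in PHONE.finditer(text)]
    actions += [("track", m.group()) for m in TRACKING.finditer(text)]
    return actions

print(find_actions(OCR_TEXT))
```

Each detected pair would then be rendered as a tappable link, dialing the number or opening the carrier's tracking page.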
And Google plays catch-up with Apple on privacy.
Apple has made waves over the last few years with tools to protect user privacy, including a button that allows iPhone owners to ask apps not to track their activities or share them with third parties like marketers.
Google, whose revenue relies primarily on serving digital ads, has not responded with a similar prompt for people to easily opt out of tracking. But Android 12 will give people more transparency into the data that apps collect and provide new controls to restrict an app’s access to information.
One tool, which Google calls a privacy dashboard, shows a timeline of when apps gained access to different parts of the phone throughout the day. It could indicate that a social media app tapped your camera at 1 p.m. and that a weather app used your location at 3 p.m.
Google has also added kill switches to shut off an app’s access to the phone’s microphone and camera. That could be useful in sensitive situations, like a doctor’s visit, when you want to be sure an app is not listening in on the conversation.
Apple also said this week that more privacy controls were arriving for iPhones, including a so-called app privacy report that works similarly to Google’s privacy dashboard. In addition to revealing what data is being tapped by each app, the privacy report will show the domains, or web addresses, that the apps are contacting. That could give insight into what companies an app is sharing data with while you use it.