Popular Google+ photos have been available via Google's Wallpapers app on Android and on Google Fiber and Chromecast devices, but today, Google is bringing them to macOS too. Google’s Featured Photos Screensaver rotates through a selection of high-resolution photographs that have been publicly shared on Google+ and don’t include people in them. Each photo also includes information about the photographer who took the shot and links to more of their work. If you’re a photographer and want your photos to be considered for inclusion in the app, you can learn more here.
Blending a variety of national and local news with article recommendations based on your personal interests using machine learning, Newsstand creates a media-rich 'For You' page divided into two sections. The 'Briefing' includes a handful of what Google deems the most important and relevant stories for you. Below the Briefing is 'Highlights,' a longer list of articles culled from your favorite sources and topics. Each article in Highlights helpfully explains why it was suggested.
Tapping the three-dot menu button below any article lets you hide stories from its source, have fewer articles of that type suggested (I took advantage of this immediately with CNN's report on a Parmesan cheese recall), or jump directly to the source or topic of the article. I've found the last two options a great way to quickly build a database of topics and sources that I want to follow.
Newsstand is built on an AMP foundation:
We have improved our support for multimedia content building on the AMP support we launched earlier this year. Scroll through your feed, and you will see autoplay videos, easy podcast controls, and high-resolution, full-bleed images. Every story and topic in Newsstand now comes to life in a more engaging, beautiful presentation.
I'm not a fan of autoplay anything, but Newsstand's articles look terrific and load fast.
Finally, Google also touts Newsstand's new web app as a way to access news wherever you are. It’s a broad claim that needs to be qualified. The unstated assumption seems to be that the web app is for desktop use only, because it doesn't work on iOS even if you use Google's Chrome browser. Moreover, on macOS, Newsstand doesn’t work with Safari, instead directing you to download Chrome.
In some ways Google Play Newsstand feels like a modern implementation of Google Reader, which was shuttered in 2013 around the same time that Newsstand was introduced. I wonder how much better my recommendations would be if Newsstand had the benefit of all the years I used Google Reader. Maybe it does have access to that data, but using Newsstand feels too much like starting over for that to be the case. In any event, Google Play Newsstand is a worthy competitor to Apple News. Perhaps 2017 will see competition among news services similar to what we've seen with photo services this year.
Earlier today, Google announced a new app called PhotoScan and updates to Google Photos. PhotoScan is a simple scanner app for capturing prints. On launch it opens to a camera view with the instruction to frame your photo within the rectangle on the screen. When you tap the shutter, instead of taking a picture of your print, PhotoScan displays a circle in the middle of the viewfinder with four other circles near the corners of your print. A little arrow prompts you to move your phone to line up the middle circle with each of the four other circles, one at a time. When you’re finished, PhotoScan processes the data it’s collected and presents you with your scanned print, which can be further adjusted in-app.
PhotoScan, which is a free download on the App Store, doesn’t require you to sign into a Google account, unless you want to save your scans to Google Photos. If you prefer, you can save your scans to Apple’s Photos.
Google made a video demoing how PhotoScan works:
I tried PhotoScan on a handful of family photos to see how it would fare. In my tests, I found that there are a couple of simple things you can do to greatly improve your scans. First, find a spot where the lighting is good but indirect, which will help avoid glare on glossy photos. Second, don’t use the flash. Here’s an example of a scan with and without the flash that I took in the same spot, from the same distance, and with the same lighting:
Not every photo taken with the flash on had this much glare, but most had a bright white spot in the middle of the photos. Here are three scans that came out much better that were taken under normal lighting conditions in my kitchen with the flash turned off:
PhotoScan does have some bugs. It crashed a couple of times while I was using it. The second crash happened after I scanned fourteen photos. I went to the preview page to save them, and when I tapped ‘Save All,’ PhotoScan crashed. When I reopened the app, all of my scans were gone. I thought I had lost data, but it turns out they were saved to Photos before the app crashed, so what could have been a scary moment if I had scanned dozens of photos turned out fine.
Overall, PhotoScan did a good job detecting the corners of prints and properly cropping most of them. PhotoScan also did a good job capturing the colors and detail of each shot as long as the flash was disabled. None of the snapshots I scanned were in perfect focus to begin with, but the scans were noticeably fuzzier than the originals, and the colors were a little off in some. Despite the bugs and limitations though, PhotoScan is an app I’ll keep close by when I visit relatives over the holidays for when they pull out family photo albums, because it’s so convenient and easy to use.
Google also added three new features to Google Photos today. The first is an improved auto-enhance tool. Second, Google added twelve new filters, which it calls ‘Looks.’ The feature first edits the photo to enhance it and then applies a filter that complements your photo. How does Google Photos know how to adjust its filters to complement your photos? Machine learning, of course. The third feature is fine-grained light and color editing tools. The Verge reports that Google is also introducing three new automatically created videos to Google Photos: for newborns, formal occasions like weddings, and a ‘through the years’ slideshow for annual events like holiday gatherings.
Google is revamping Google Play Music with intelligence that it says will deliver the right music at the right time using machine learning. According to a post by Elias Roman, Lead Product Manager for Google Play Music, Google’s streaming music service will go beyond just figuring out what you like from the music you listen to. The update will also take into account context – things like your location, what you’re doing, and even the weather.
To provide even richer music recommendations based on Google’s understanding of your world, we’ve plugged into the contextual tools that power Google products. When you opt in, we’ll deliver personalized music based on where you are and why you are listening — relaxing at home, powering through at work, commuting, flying, exploring new cities, heading out on the town, and everything in between. Your workout music is front and center as you walk into the gym, a sunset soundtrack appears just as the sky goes pink, and tunes for focusing turn up at the library.
In addition, Google has redesigned the Google Play Music home screen to emphasize your favorite music by putting it right at the top of the screen and adjusting what’s shown based on your context. The service will also automatically create an offline playlist of recently played songs for subscribers to listen to when they have no data connection.
It’s not surprising to see Google take Google Play Music in this direction. One of Google’s biggest competitive advantages is the data it knows about you from its many products. This sort of assistive technology is already baked into products like Google Photos and it seems natural to bring the same smarts to Google Play Music too.
Google Play Music will begin its worldwide rollout to sixty-two countries this week on iOS, Android, and the web.
Until today, Gmail hadn’t been updated since January 2016. That’s a long time between updates for a major Google service. Meanwhile, Inbox, Google’s other email client, has been repeatedly updated throughout 2016. That led me and others to wonder whether Google was abandoning its more traditional email client for Inbox’s cutting-edge features. Although Gmail appears to be on a slower update cycle than Inbox, today Gmail received a handful of solid new features and a design update that breathes fresh life into the app.
Gmail’s new design brings it much closer to the UI of its Android cousin and other Google apps available on iOS. The design is colorful, uncluttered, and easy to read with a compose button right where you expect it in a Google app.
Google also added the ability to quickly archive or delete messages with a left or right swipe gesture from your inbox. By default, Gmail is set to archive when you swipe on a message, but you can switch it to delete messages in settings on a per-account basis.
Gmail’s new ‘Undo Send’ feature gives you about five seconds to tap an Undo button in the lower right hand corner of the screen to retrieve a message that you didn’t mean to send. It would be nice to be able to adjust the length of the delay before a message is sent, but I suspect five seconds is enough in most circumstances.
In addition, Google says it has made email search even faster. The speed and accuracy of Gmail’s search is the primary reason I keep Gmail installed. I don’t use Gmail regularly enough to comment on whether the new search is faster, but in my limited testing search results appeared almost instantly on a good WiFi connection, even when retrieving months-old email.
I’m glad to see Google pushing Gmail forward. With this update, it fits better aesthetically with the rest of Google’s iOS apps and adds features that keep Gmail on par with other iOS email clients. I only wish that Gmail also included a unified inbox option instead of requiring me to switch among multiple accounts.
This is a good video by Marques Brownlee on where things stand today between Siri (iOS 10) and the Google Assistant (running Android Nougat on a Google Pixel XL). Three takeaways: Google Assistant is more chatty than old Google Voice Search; Google still seems to have an edge over Siri when it comes to follow-up questions based on topic inference (which Siri also does, but not as well); and Siri holds up well in most types of questions asked by Brownlee.
In my daily experience, however, Siri still falls short on basic tasks too often (two examples) and deals with questions inconsistently. There is also, I believe, a perception problem with Siri in that Apple fixes obvious Siri shortcomings too slowly or simply isn't prepared for new types of questions – such as asking how the last presidential debate went. In addition, being able to text with Google Assistant in Allo for iOS has reinforced a longstanding wish of mine – the ability to converse silently with a digital assistant. I hope Siri gets some kind of textual mode or iMessage integration in iOS 11.
One note on Brownlee's video: the reason Siri isn't as conversational as Google Assistant is due to the way Brownlee activates Siri. When invoked with the Home button (or by tapping the microphone icon), Siri assumes the user is looking at the screen and provides fewer audio cues, prioritizing visual feedback instead. If Brownlee had opened Siri using "Hey Siri" hands-free activation, Siri would have likely been just as conversational as Google. I prefer Apple's approach here – if I'm holding a phone, it means I can look at the UI, and there's no need to speak detailed results aloud.
Google released a nice update to their iOS keyboard, Gboard, earlier today.
Cursor control can now be activated with 3D Touch, which is consistent with the behavior of Apple's keyboard. Gboard can't move the cursor freely on the screen like the system keyboard, though, which makes it more limited when it comes to swiping across multiple lines of text. Also, Google didn't implement haptic feedback when switching between contextual keyboard menus (such as holding down on the dash key), which is another detail that I appreciate in Apple's keyboard on the iPhone 7.
Similarly, Gboard now features Contacts integration to look up a person's contact card directly from the keyboard – but it's not as tightly integrated as QuickType suggestions in iOS 10. However, I prefer the presentation of contact cards in Gboard and I think Google's is a sweet solution as well.
Gboard is shaping up nicely, but I continue to wish Google paid more attention to the iPad layout and built true multilingual support for international users.
Google Photos has introduced four new features:
- Google Photos uses faces in your most recent photos to suggest older photos with the same person in them;
- If you take a lot of photos of the same subject, like a child, Google Photos will create a card of the best ones from the past month;
- Animations, which Google Photos already creates using photos, are also generated from videos now; and
- If Google Photos detects that there are sideways photos in your collection, it will present a card with the photos that it thinks should be rotated.
This is what Google Photos does best. It finds connections and photos that would be a needle-in-a-haystack search if you tried to do it manually with a big photo library.
Each of the new features is available on iOS, Android, and the web.
As an Apple user, what grabbed my attention was the multi-platform nature of Google Assistant, the company's take on Siri powered by advanced AI and third-party service integrations. As explained by Jacob Kastrenakes at The Verge, Google Assistant will be available in three places, with some initial differences:
Google is almost certainly going to combine the different forms of its Assistant eventually so that they're all more-or-less equal in terms of features (save for, you know, stuff like screen search on a speaker). But for now, it’ll be occasionally limiting of what you can do.
And if you’re wondering how Google Now factors into all of this, well, join the club. Google says some of the stuff in Now that makes sense to be in an assistant will migrate over there — but how and when that will happen is anyone’s guess.
These inconsistencies may be part of why Google is keeping Assistant exclusive to just a few of its own products for the time being. It's a core feature of those products, but it's still in its early stages, with a mixed bag of features and ways to use it.
Obviously, Google Assistant is never going to be as prominent on iOS as it is on Android and other Google hardware, but I hope Google will update their iOS apps to bring Assistant's features to iPhone and iPad users.
Right now, Assistant can be accessed in a conversational form inside Allo, Google's messaging app that I only keep installed because of its Assistant support. The Google app is limited to Google Now (which I enjoy because of its time to leave and package tracking updates), but I'd expect it to gain Assistant in the future, simulating a Siri-like interface like Microsoft did with Cortana on iOS. Or perhaps Google could launch a full-fledged Assistant app on iOS, bundling text interactions (which Siri doesn't have) and voice activation in a single utility. I'm curious to see how (and when) Google brings more of its Assistant to Apple's devices.