Posts tagged with "camera"

CES: A Tour of the Most Interesting (and Strange) Tech Announcements

CES has been going strong all week with announcements of new gadgets: home automation gear, TVs, computers, and lots more. Many mobile phone makers and some big industry players sit out CES, but there is still plenty of news from companies big and small with new products and technologies to show off.

A lot of what gets hyped at CES is prototypes and concept devices that will never ship or will get delayed. Still, every year I find that CES is fascinating to study for the industry trends it reveals and the handful of gadgets I discover that I’d like to try.

After combing through hundreds of headlines and press releases, I’ve compiled a roundup of some of this week’s most compelling announcements, organized by category so you can skip to whatever interests you most.



Apple Announces Night Mode Photography Contest

About this time last year, Apple announced its first-ever ‘Shot on iPhone’ photography challenge, judged by a panel of professional photographers and Apple employees. Apple is back with a new contest this year, asking users to submit their Night mode photos.

Through January 29th, Apple is taking submissions on Instagram, Twitter, and Weibo. To qualify, post your photos on Instagram or Twitter with the hashtags #ShotoniPhone and #NightmodeChallenge, or on Weibo with #ShotoniPhone# and #NightmodeChallenge#.

Five winners will be picked by a panel of judges that includes outside photographers, plus the following Apple executives and employees:

  • Phil Schiller
  • Kaiann Drance
  • Brooks Kraft
  • Jon McCormack
  • Arem Duplessis

The five winning photos will be announced on March 4th on the Apple Newsroom. Apple says the images may also be used in digital campaigns, at stores, on billboards, and in photo exhibitions.

Night mode photography was a big part of Federico’s story on iPhone 11 Pro photography called Eternal City, Modern Photography: The iPhone 11 Pro in Rome. Here’s an outtake from that story that Federico submitted for the challenge.

For more on the contest and tips on shooting Night mode photos, check out Apple’s press release.


Hands-On with Clips 2.1: Memoji and Animoji Support, Plus New Sticker Face Tracking and More

Apple has released the first big update in over a year for its Clips video creation tool. Following the trend begun in iOS 12, which added Animoji support to FaceTime, all Animoji and Memoji characters can now also be used inside Clips. Though I would have expected such an update a year ago, it’s nevertheless good to see. Besides Animoji and Memoji, Clips 2.1 adds only a couple of other small features, like a fresh batch of Mickey and Minnie stickers, a ‘Let It Snow’ winter poster, and support for right-to-left languages. After spending some time with the update, I found a couple of nice implementation details related to Animoji that deserve highlighting.



Sebastiaan de With Explains Why the iPhone 11 Camera Is Such a Big Leap Forward

Sebastiaan de With, part of the team behind the camera app Halide, has published part 1 of a multi-part breakdown of the iPhone 11 camera. It’s a fantastic analysis of what makes the new camera different from past versions, and it goes into great depth while remaining accessible, even if you have only a passing familiarity with photography.

To put this year’s camera into perspective, de With recaps what Apple did with last year’s iPhone cameras, explaining how Smart HDR works and where it falls short. The iPhone 11 features Smart HDR too, but as de With explains, Apple has significantly improved how it handles the dynamic range of an image.

Another aspect of the improvement is in the camera sensor hardware. Despite its diminutive size, the iPhone 11’s image sensor can resolve more detail than any iPhone camera before it.

However, many of the iPhone 11’s camera improvements come down to better software. The new camera post-processes each component of an image differently, applying different noise reduction to the sky, a face, hair, and clothing, for example. Apple calls the feature Photo Segmentation, and it’s aided by machine learning.
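
Apple hasn’t published how Photo Segmentation works, but the core idea of per-region processing is easy to sketch. In the hypothetical Swift below, every pixel carries a semantic label and the denoising strength varies by region; all of the names and values are invented for illustration:

```swift
// Conceptual sketch of segmentation-driven processing: every pixel carries a
// semantic label, and each region gets its own denoising strength. Names and
// values are invented; Apple's actual pipeline is not public.
enum Region { case sky, face, hair, clothing, other }

func denoiseStrength(for region: Region) -> Float {
    switch region {
    case .sky:      return 0.9  // smooth aggressively: skies are mostly flat gradients
    case .face:     return 0.5  // moderate: preserve skin texture
    case .hair:     return 0.2  // light: keep fine strands sharp
    case .clothing: return 0.4
    case .other:    return 0.6
    }
}

// Apply a per-pixel denoise, weighted by each pixel's semantic label.
func process(pixels: [Float],
             labels: [Region],
             denoise: (_ value: Float, _ strength: Float) -> Float) -> [Float] {
    zip(pixels, labels).map { value, label in
        denoise(value, denoiseStrength(for: label))
    }
}
```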

One of my favorite features of the new camera is Night Mode. As de With notes:

In the iPhone 11 Night Mode, you can also see detail vanish in some areas. Except that it really seems to only affect parts of the image that you don’t really care that much about. Night Mode has a remarkable if not uncanny ability to extract an image that is sometimes even sharper than the regular mode, with strong sharpening and detail retention occurring in areas that are selected by the camera during processing.

The iPhone 11’s camera is also the first one de With thinks rivals standalone cameras:

In the past, iPhones made great photos for sharing on social media, but blown up on a big screen, the shots didn’t hold up. It’s why I frequently still pack a ‘big’ camera with me on trips.

With these huge improvements in processing, the iPhone 11 is the first iPhone that legitimately challenges a dedicated camera.

There are many more details in de With’s article, including a close look at the iPhone 11’s ultra wide lens. Every section of the post has photos and side-by-side comparisons that illustrate the analysis, which makes the full post a must-read.


Apple’s Deep Fusion Camera Feature Launching as Part of the iOS Developer Beta Program

According to TechCrunch’s Matthew Panzarino, Apple will today roll out the Deep Fusion camera feature, announced at the company’s fall iPhone event, as part of the iOS developer beta program.

Deep Fusion is Apple’s new method of combining several image exposures at the pixel level for enhanced definition and color range beyond what is possible with traditional HDR techniques. Panzarino explains how Deep Fusion works:

The camera shoots a ‘short’ frame, at a negative EV value. Basically a slightly darker image than you’d like, and pulls sharpness from this frame. It then shoots 3 regular EV0 photos and a ‘long’ EV+ frame, registers alignment and blends those together.

This produces two 12MP photos – 24MP worth of data – which are combined into one 12MP result photo. The combination of the two is done using 4 separate neural networks which take into account the noise characteristics of Apple’s camera sensors as well as the subject matter in the image.

Apple told Panzarino that the technique “results in better skin transitions, better clothing detail and better crispness at the edges of moving subjects.”

There is no button or switch to turn Deep Fusion on. Like the over-crop feature that uses the ultra wide lens to allow photo reframing after the fact, Deep Fusion is engaged automatically depending on the camera lens used and light characteristics of the shot being taken. Panzarino also notes that Deep Fusion, which is only available for iPhones that use the A13 processor, does not work when the over-crop feature is turned on.
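
Deep Fusion’s capture pipeline is private, but the exposure bracket Panzarino describes resembles what any app can request through AVFoundation’s bracketed capture API. Here’s a minimal sketch; note that the public API caps the bracket size (see maxBracketedCapturePhotoCount, typically three frames), fewer than Deep Fusion reportedly uses:

```swift
import AVFoundation

// Request an EV bracket (underexposed, EV0, overexposed) similar in spirit to
// the frames Deep Fusion combines; the neural-network merging is private to Apple.
func captureEVBracket(with output: AVCapturePhotoOutput,
                      delegate: AVCapturePhotoCaptureDelegate) {
    let biases: [Float] = [-2.0, 0.0, 2.0]  // 'short', regular, and 'long' frames
    let bracket = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // 0 = no RAW; capture processed frames only
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracket)
    output.capturePhoto(with: settings, delegate: delegate)
}
```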

I’ve been curious about Deep Fusion since it was announced. It’s remarkable that photography has become as much about machine learning as it is about the physics of light and lenses. Deep Fusion is also the sort of feature that can’t be demonstrated well onstage, so I’m eager to get my hands on the beta and try it myself.

Permalink

Halide 1.14 Adds New Lens Switching Interface and Guides

Halide 1.14 is out with a new lens switching UI to accommodate the three-camera system of the iPhone 11 Pro and Pro Max. As soon as the update was out, I went for a walk to give it a try.

Halide has introduced a new lens switching button featuring haptic feedback and a dial-like system for moving among the iPhone’s lenses. When you press down on the lens button, you get a tap of haptic feedback to let you know without looking that the lens picker has been engaged.

From there, you can slide your finger among the ultra wide, wide, and telephoto options that radiate out from the button. As you swipe your finger across each option, it enlarges, and you’re met with another little bit of haptic feedback as you swipe over the lenses other than the one already selected. Once you have the lens you want, you simply let go and your iPhone switches to it.
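
Halide’s control is custom, but the haptic pattern it uses maps directly onto UIKit’s feedback generators. Here’s a rough sketch of a hypothetical picker control built on that idea (the layout logic is stubbed out):

```swift
import UIKit

// Hypothetical lens picker illustrating the haptics described above: an impact
// tap when the picker engages, then a selection tick whenever the finger
// crosses onto a different lens option.
final class LensPickerControl: UIControl {
    private let press = UIImpactFeedbackGenerator(style: .medium)
    private let tick = UISelectionFeedbackGenerator()
    private var highlighted: Int?

    override func beginTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        press.impactOccurred()          // tap: the picker is engaged
        tick.prepare()                  // keep the Taptic Engine warm
        return true
    }

    override func continueTracking(_ touch: UITouch, with event: UIEvent?) -> Bool {
        let index = lensIndex(at: touch.location(in: self))
        if index != highlighted {
            tick.selectionChanged()     // tick: finger crossed onto a new lens
            highlighted = index
        }
        return true
    }

    override func endTracking(_ touch: UITouch?, with event: UIEvent?) {
        super.endTracking(touch, with: event)
        sendActions(for: .valueChanged) // let go: switch to the chosen lens
    }

    // Map a touch point to a lens option; the real layout is app-specific.
    private func lensIndex(at point: CGPoint) -> Int {
        point.x < bounds.midX ? 0 : 1
    }
}
```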

You can also cycle through the lenses in order by tapping the button repeatedly or swipe left for the ultra wide lens or up for the telephoto one. In my brief tests, swiping left or up is the best option if you already know the lens you want, but using the dial-like lens switcher is perfect for considering your options first because Halide has also added lens preview guides.

With the lens button engaged, Halide shows guides for each of your zoom options. That means if you’re using the ultra wide lens, you’ll see light gray guidelines for the wide and telephoto lenses. As you swipe over those lenses, the guides change to yellow to highlight the composition you’ll get if you switch to that lens.

If you’re already using the telephoto lens, Halide highlights the outer frame of the image to suggest you’ll get a wider shot, though it doesn’t zoom the viewfinder out to show that composition until you lift your finger. You can see how the lens guides work in the screenshots I took at a local high school football field above and in this video:

Switching lenses in Halide.

When you switch to the ultra wide lens, you’ll notice that not all the usual Halide features are available. Manual focus is missing, and so is shooting in RAW. That’s because the new iPhone hardware and iOS and iPadOS 13 don’t support those features for the ultra wide camera. Since ultra wide shots can’t be captured in RAW, Halide includes a ‘MAX’ option in place of the ‘RAW’ option, so you can get the most image data possible from your ultra wide shots, as you can see in the screenshots below.

Ultra wide images are limited to MAX quality (left) instead of RAW, which is supported by the wide and telephoto lenses (right).
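
This limitation isn’t something Halide chose; it surfaces in AVFoundation itself, where the ultra wide camera reports no available RAW formats. Here’s a small sketch of how an app might check (the API calls are real, the wiring is simplified):

```swift
import AVFoundation

// Check whether a camera supports RAW capture. On iPhone 11 hardware the
// ultra wide camera reports no RAW formats, which is why Halide falls back
// to a processed 'MAX' quality for that lens.
func supportsRAW(device: AVCaptureDevice) -> Bool {
    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()
    guard let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else { return false }
    session.addInput(input)
    guard session.canAddOutput(output) else { return false }
    session.addOutput(output)
    // RAW availability is only populated once the output joins a session.
    return !output.availableRawPhotoPixelFormatTypes.isEmpty
}

if let ultraWide = AVCaptureDevice.default(.builtInUltraWideCamera,
                                           for: .video, position: .back) {
    print("Ultra wide RAW support:", supportsRAW(device: ultraWide))
}
```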

The Halide team says the latest update also includes noise-reduction adjustments to the RAW images produced by the iPhone 11, but that they are continuing to fine-tune how the app handles RAW photos from the new phones as part of a more significant update coming next.

The latest update is relatively small, but I especially like the use of haptic feedback and lens guides, which make it easy to switch lenses when you’re focused on the viewfinder of the camera instead of Halide’s buttons.

Halide is available on the App Store for $5.99.


Austin Mann on the iPhone 11 and 11 Pro Cameras

Source: austinmann.com

Every year I look forward to Austin Mann taking the latest iPhones through their paces somewhere in the world. This year, Mann is on tour with cellist Yo-Yo Ma in China, where he went out into the countryside to capture some stunning portraits and landscapes.

Mann’s review covers the new Ultra Wide lens, Night mode, Smart HDR improvements, and the ability to capture outside the frame, along with wishes for additional improvements. Mann’s take on Night mode:

As long as I can remember, the top question I’ve received from iPhone photographers, beginners and pros alike, is, “How can I shoot better pictures in low light?” This year’s addition of Night mode is the answer to the question. It’s easy to use, crazy powerful, and because it’s automatic it will completely change how everyone shoots on their iPhone.

Mann confirms what seemed to be the case from the photos that Apple showed off last week at its event in Cupertino – Apple has implemented Night mode in a way that doesn’t try to turn night into day:

One thing I love about Apple’s approach to Night mode is the strategic balance of solving a technical problem while also caring deeply about artistic expression. When you look at the image above, it’s clear their team didn’t take the let’s-make-night-look-like-day approach, as some of their competitors have. Instead, it feels more like an embrace of what it actually is (night) while asking, “How do we capture the feel of this scene in a beautiful way?”

How Apple accomplishes Night mode is interesting. As Mann explains:

From what I understand, the way Night mode actually works is the camera captures a bunch of short exposures and slightly longer exposures, checks them for sharpness, throws out the bad ones and blends the good ones. On a traditional DSLR/mirrorless camera, a 5 second exposure is one single, continuous recording of the light throughout the duration of the shutter so any movement (of subject or camera) is recorded.

But with iPhone 11 Pro the rules are different… it’s not capturing one single continuous frame but blending a whole bunch of shots with variable lengths (some shorter exposures to freeze motion and longer shots to expose the shadows.) This means the subject can actually move during your exposure but still remain sharp.
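
Apple’s real pipeline is far more sophisticated (it also aligns frames and merges them with machine learning), but the select-and-blend idea Mann describes can be sketched in a few lines. This toy Swift scores grayscale frames with a Laplacian-variance sharpness proxy, discards the blurriest, and averages the rest; everything here is illustrative:

```swift
typealias Frame = [[Float]]  // grayscale image as rows of pixel values

// Sharpness proxy: variance of a 4-neighbor Laplacian response.
func sharpness(_ f: Frame) -> Float {
    var v: [Float] = []
    for y in 1..<f.count - 1 {
        for x in 1..<f[0].count - 1 {
            v.append(4 * f[y][x] - f[y-1][x] - f[y+1][x] - f[y][x-1] - f[y][x+1])
        }
    }
    let mean = v.reduce(0, +) / Float(v.count)
    return v.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Float(v.count)
}

// Keep the sharpest fraction of frames, then average them to build up light.
func blendNightFrames(_ frames: [Frame], keepFraction: Double = 0.6) -> Frame {
    precondition(!frames.isEmpty)
    let kept = frames
        .sorted { sharpness($0) > sharpness($1) }
        .prefix(max(1, Int(Double(frames.count) * keepFraction)))
    var out = Frame(repeating: [Float](repeating: 0, count: frames[0][0].count),
                    count: frames[0].count)
    for f in kept {
        for y in out.indices { for x in out[y].indices { out[y][x] += f[y][x] } }
    }
    for y in out.indices { for x in out[y].indices { out[y][x] /= Float(kept.count) } }
    return out
}
```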

If you’ve been wondering about the new Ultra Wide camera on the new iPhones or the other new features of the camera app, be sure to check out Austin Mann’s full review. It offers great technical and artistic insights into what Apple has accomplished with its new cameras, as well as some absolutely fantastic examples of what they can do.


Spectre: A Computational Approach to Long-Exposure iPhone Photography

Spectre is a new specialized camera app from the team that created Halide, one of our favorite camera apps on iOS. The Halide team describes Spectre as a computational shutter for the iPhone, which allows the app to do things like remove people from a crowded scene, create artistic images of rushing water, and produce light trails at night. The same sorts of images can be created with traditional cameras, but getting the exposure right, holding the camera absolutely still, and accounting for other factors make them difficult to pull off. With Spectre, artificial intelligence simplifies the process and makes long-exposure photography accessible to anyone with an iPhone.
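
Spectre’s machine learning approach is proprietary, but the classic trick behind effects like removing people from a scene is easy to illustrate: with a steady camera, the per-pixel median over many frames keeps the static background and rejects anything that passes through only briefly. A toy sketch on grayscale frames (all names are illustrative):

```swift
// Per-pixel median across frames: the static scene survives, while transient
// subjects (say, a person walking through) are discarded. Frames are flat
// Float arrays of equal length for brevity.
func medianStack(_ frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var result = [Float](repeating: 0, count: first.count)
    for i in first.indices {
        let samples = frames.map { $0[i] }.sorted()
        result[i] = samples[samples.count / 2]
    }
    return result
}
```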



Apple’s Clips Introduces New Selfie Scenes, Filters, and More

On launch day for the new iPad Pros, which feature the iPhone’s TrueDepth camera system for the first time, Apple has upgraded its Clips video app with new features that take advantage of TrueDepth’s power. Today’s update also brings new camera filters, posters, stickers, and soundtrack options.

If you have an iPhone X, XS, or XR, or one of the new iPad Pros, the highlight of this release is a batch of six new Selfie Scenes. Last year when the iPhone X launched, Clips debuted Selfie Scenes as a fun and impressive way to utilize the new device’s TrueDepth camera system. Selfie Scenes isolate you from your environment and replace your surroundings with interesting digital backgrounds, such as the Millennium Falcon from Star Wars or an animated world. The new scenes added in today’s update are called Clouds, High Noon, Big Backyard, Monster Lab, Animal Forest, and Municiberg Mayhem, a scene from Incredibles 2. They’re all a lot of fun, providing different moods through sound effects and music. Apple says that Selfie Scenes perform better than ever on recent devices thanks to the A12 Bionic’s Neural Engine, and in my tests I did notice that the scenes were smoother and more responsive than before.

Filters and soundtracks are Clips’ next most substantial upgrades. There are three new filters: Comic Mono, Watercolor Mono, and Aged Film. Of the three, Aged Film is easily my favorite, as the first two are only monochrome versions of existing filters. On the soundtrack side, there are a whopping 17 new tunes to choose from for your videos.

Every major Clips update adds a host of new posters, stickers, and text labels, and today’s is no exception. Continuing Apple’s partnership with Disney, there are poster options from Coco and Incredibles 2, as well as designs related to sports, science, and more.

Though the app’s development cycle has slowed, Apple continues to plug away at making Clips a great tool for short video creation. I think Apple’s on to something here, but my biggest wish for the app – support for non-square video – remains unfulfilled and continues to hold Clips back. Hopefully one day we’ll get that change; until then, the Selfie Scenes are a fun demo of the newest iPhones and iPad Pros, and Clips remains the most enjoyable video creation tool I’ve ever used.