
Posts tagged with "camera"

Clips 3.0 Brings New Video Aspect Ratios and an Upgraded iPad Experience

Clips is finally growing up.

Since its debut in 2017, Clips has offered a variety of fun, easy-to-use tools for making and editing videos, but it has always been hamstrung by one severe limitation: you could only create videos with a square aspect ratio. I’ve always found Clips far more accessible than other video creation apps, such as Apple’s iMovie, because it was designed from the ground up for mobile. The inability to create standard widescreen videos, however, or even portrait videos for Instagram stories, was a dealbreaker. As a result, my use of Clips has been only intermittent; despite liking the app a lot, I’ve created just two legitimate projects with it.

My use is sure to increase now. With today’s 3.0 update for Clips, Apple is finally eliminating the app’s requirement for square video while simultaneously making its iPad experience far better than before. Clips 3 is what the original app should have been, and it might just mean Apple finally has the ingredients for a hit.



Halide Mark II Review: The Convenience of Computational Photography and Flexibility of RAW in an Elegant Camera App

iPhone photography has come a long way in the past 13 years. The original iPhone had a 2 MP camera that produced 1600 x 1200-pixel images. Today, the wide camera on the iPhone 12 Pro has a 12 MP sensor that can capture 4032 x 3024-pixel shots.

Hardware advancements have played a big role in iPhone photography, but so has software. The size of an iPhone and physics limit hardware advances, resulting in diminishing returns year-over-year. Consequently, Apple and other mobile phone makers have turned to computational photography to bring the power of modern SoCs to bear, improving the quality of images produced by iPhones with software.

Computational photography has advanced rapidly, pushed forward by the increasingly powerful chips that power our iPhones. Every time you take a photo with your iPhone, it’s actually taking several, stitching them together, using AI to compute adjustments to make the image look better, and presenting you with a final product. The process feels instantaneous, but it’s the result of many steps that begin even before you press the shutter button.
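To make the multi-frame idea concrete, here’s a deliberately simplified sketch. This is not Apple’s actual pipeline, which also aligns frames, fuses different exposure times, and applies machine-learned adjustments; it only illustrates the core merge step, where averaging several exposures of the same scene cancels out random sensor noise.

```python
# Toy sketch of one multi-frame technique: averaging a burst of
# exposures of the same scene to reduce random sensor noise.
# A frame is modeled as a flat list of pixel values.

def merge(frames):
    """Average corresponding pixels across a burst of frames."""
    n = len(frames)
    return [sum(pixels) / n for pixels in zip(*frames)]

# Three noisy "exposures" of the same tiny 3-pixel scene
# (true values: 10, 50, 120). Each frame is off by up to 3 units.
frames = [
    [12.0, 47.0, 121.0],
    [9.0, 52.0, 118.0],
    [10.0, 51.0, 122.0],
]

merged = merge(frames)
print(merged)  # the averaged frame sits much closer to the true values
```

Because the noise in each frame is independent, the averaged pixels land closer to the true scene than any single exposure does; the same principle underpins burst-based features like Night mode.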

However, the simplicity and efficiency of computational photography come with a tradeoff. That pipeline from the point you press the Camera app’s shutter button until you see the image you took involves a long series of steps. In turn, each of those steps involves a series of judgment calls and the application of someone else’s taste about how the photo should look.

Apple has made great strides in computational photography in recent years, but it also means someone else’s taste is being applied to your images. Source: Apple.

In many circumstances, the editorial choices made by the Camera app result in great photos, but not always, and the trouble is, your ability to tweak the images you take in compressed file formats is limited. A more flexible alternative is to shoot in a RAW file format that preserves more data, allowing for a greater range in editing options, but often, the friction of editing RAW images isn’t worth it. The Camera app is good enough most of the time, so we tolerate the shots that don’t look great.

However, what if you could have the best of both worlds? What if you could capture a lightweight, automatically adjusted photo and an editing-friendly RAW image at the same time, allowing you to pick the right one for each image you take? If you like the JPEG or HEIC image produced by Apple’s computational photography workflow, you could keep it, but you could always fall back to the RAW version if you want more editing latitude. That way, you could rely on the editorial choices baked into iOS where you like the results but retain control for those times when you don’t.

That’s what Halide Mark II by Lux sets out to accomplish. Halide is a MacStories favorite that we’ve covered many times in the past, but Mark II is something special. The latest update is an ambitious reimagining of what was already a premier camera app, building on what came before with a simpler, easier-to-learn UI. Halide Mark II puts more control than ever into the hands of photographers while also making it easy to achieve beautiful results with minimal effort. Halide also seeks to educate through a combination of design and upcoming in-app photography lessons.

By and large, Halide succeeds. Photography is a notoriously jargon-heavy, complex field, and it’s still possible to get bogged down fretting over which settings are best in which circumstances. However, Halide provides the most effective bridge from point-and-shoot photography to something far more sophisticated of any camera app I’ve used. The result is an app that’s a pleasure to use, gives iPhone photographers control over the images they shoot, and encourages them to learn more and grow as photographers.



Austin Mann’s iPhone 12 Pro Camera Review

Pro travel photographer Austin Mann has put the iPhone 12 Pro through its paces in Glacier National Park, Montana. You won’t want to miss the full selection of images shot by Mann, who concentrated his testing on the low-light performance of the iPhone 12 Pro’s improved Wide lens, the addition of Night mode to the Ultra Wide lens, Smart HDR 3, and low-light portrait mode photos that take advantage of the Pro’s new LiDAR sensor for autofocus.

Mann also delves into the Camera app’s settings to reveal new controls provided to photographers. My favorite is the option to preserve exposure settings between shots. As Mann explains:

I’m absolutely thrilled about this new (and hardly talked about) feature. We now have true exposure adjustment that doesn’t revert back to auto mode every time a picture is captured.

This adjustment remains even when you switch between .5x, 1x, and 2x lenses, or when switching modes. Even if you lock your iPhone and come back to it later, it still remembers your exposure settings. This is much more like working with a traditional manual camera and I love it.

Mann’s post includes other compelling additions to the Camera app’s settings as well as beautiful shots that do a fantastic job of demonstrating this year’s camera advances. What I’m most eager to see, though, is what Mann thinks of the iPhone 12 Pro Max, which promises significantly better hardware than the iPhone 12 Pro.


Understanding the Camera Enhancements Coming to the iPhone 12

Apple announced a lot of improvements to the cameras in its upcoming iPhone 12 line. The enhancements are a combination of hardware and software that look promising on paper, especially when it comes to the iPhone 12 Pro Max.

As with past iPhone announcements, Sebastiaan de With, part of the team behind the camera apps Halide and Spectre, breaks down the changes, explaining what they mean for photographers. Probably the most significant hardware change is to the iPhone 12 Pro Max’s sensor. According to de With:

In addition to a better lens, the 12 Pro Max has the room to pack a new, 47% larger sensor. That means bigger pixels, and bigger pixels that capture more light simply means better photos. More detail in the day, more light at night. That combines with the lens to result in almost twice as much light captured: Apple claims an 87% improvement in light capture from the 11 Pro. That’s huge.

Also, the Pro Max gains a new image stabilization system:

But that’s not its only trick: the 12 Pro Max’s Wide system also gets a new sensor-shift OIS system. OIS, or Optical Image Stabilization, lets your iPhone move the camera around a bit to compensate for your decidedly unsteady human trembly hands. That results in smoother video captures and sharp shots at night, when the iPhone has to take in light over a longer amount of time.

On the software side, de With thinks Apple’s announcement of the new ProRAW image format looks promising, though it’s short on details at the moment and isn’t coming until later this year:

ProRAW, according to Apple, gives you the standard RAW along with this pipeline information, which should offer some fantastic flexibility when editing. Note that this might be a custom format; little is known, and it seems it was only announced and might be limited to the iPhone 12 Pro.

That’s just the tip of the iceberg when it comes to the camera announcements made earlier this week, so be sure to read de With’s article for additional details.

I also recommend watching Tyler Stalman’s video about the new cameras. Stalman has been watching the evolution of Apple’s cameras carefully for a long time now, and his video is a terrific overview from a photographer’s perspective of what’s coming next.


Eve Cam: An Excellent Addition to a HomeKit Secure Video Setup

Last week I added a new camera to my HomeKit Secure Video setup: the Eve Cam. Announced at CES this year, what drew me to the camera was its slim profile and HomeKit Secure Video support. I’ve used other Eve home automation products in the past and had high hopes that the Eve Cam would be just as easy to install and as reliable as the electrical outlets and door sensors I’ve tried. So far, I haven’t been disappointed.



Epic Games Releases iPhone App That Captures Facial Expressions to Unreal Engine

Source: Epic Games.


Epic Games has released a new iPhone app for videogame developers that captures facial expressions and pipes them into the company’s Unreal Engine in real time. As explained on the Unreal Engine blog:

Live Link Face streams high-quality facial animation in real-time from your iPhone directly onto characters in Unreal Engine. The app’s tracking leverages Apple’s ARKit and the iPhone’s TrueDepth front-facing camera to interactively track a performer’s face, transmitting this data directly to Unreal Engine via Live Link over a network.

What I find most interesting about Live Link Face is that Epic says it scales from solo developers working at home to sophisticated stage productions involving actors in motion capture suits and multiple iPhones. If so, that will make the app a terrific example of the sort of democratization of complex tools that technologies like ARKit and hardware like the iPhone’s TrueDepth camera make possible when integrated into existing workflows.


Halide Team Experiments with iPad Pro’s LiDAR Scanner

Source: Halide Blog.

Sebastiaan de With, on the Halide blog, goes deep on the 2020 iPad Pro’s camera module. His examination reveals that the device’s wide camera is virtually identical to that of the 2018 model. And the ultra-wide camera, unfortunately, isn’t quite up to the quality level of what’s found in the iPhone 11 and 11 Pro.

The most exciting and impressive aspect of the camera system is the LiDAR Scanner. The Halide team went so far as to build an entire proof-of-concept app that uses the LiDAR Scanner to capture your surroundings.

With Halide, we’d love to use the depth data in interesting ways, even if it’s low resolution. There was only one problem: there are no APIs for us as developers to use to get access to the underlying depth data. They only expose the processed 3D surface.

What if we re-thought photographic capture, though? We built a proof-of-concept we’re calling Esper.

Esper experiments with realtime 3D capture using the cameras and LIDAR sensor at room scale. It’s a fun and useful way to capture a space.

I always love reading de With’s in-depth explanations and comparisons of new iPhone or iPad cameras, and this was an especially fun one.


Apple Announces Winners of Its ‘Shot on iPhone’ Night Mode Challenge

Photographer: Konstantin Chalabov (Moscow, Russia), iPhone 11 Pro

Apple has announced the six winners of its Shot on iPhone challenge. The contest, which was announced at the beginning of the year, asked photographers to submit their best Night mode shots taken with the iPhone 11 Pro and iPhone 11 Pro Max.

The winning photos, which were taken by photographers from China, India, Russia, and Spain, were judged by a panel of professional photographers and Apple executives and employees. The photos are currently featured on apple.com and Apple’s Instagram account, and will appear on billboards worldwide in the future.

The images chosen by Apple’s panel of judges are fantastic. It’s remarkable what can be accomplished with Night mode, especially when you look back at what nighttime photography was like on the iPhone just a few years ago.

Don’t miss all six winning Night mode shots in Apple’s press release.


First Look: RTRO by Moment Vintage Video Camera App

Source: Moment.


RTRO by Moment is a brand-new vintage video camera app for iOS from the makers of my favorite add-on camera lenses for the iPhone and the excellent Moment Pro Camera app.

The app is a new direction for Moment. The company’s Pro Camera app, combined with its add-on lenses for the iPhone, pushes the boundaries of what’s possible with the iPhone’s camera. Packed with settings and customizations, the Pro Camera app can create stunning photos and video in the hands of a skilled photographer.

In contrast, RTRO is a video-only camera app focused first and foremost on making fun, short videos for sharing that use filters crafted by photographers to create unique retro looks. It’s those filters, which Moment calls ‘looks,’ paired with a simple, approachable interface that make the app work. It’s easy to get started, fun to use, and the videos the app creates have a unique vibe that makes even the most mundane video more interesting for viewers.
