
Posts tagged with "camera"

The Continuity Camera and ‘Unlock With Apple Watch’ Conspiracy

Continuity Camera is amazing. Since it was introduced in macOS Ventura, I’ve been using the feature almost daily. Continuity Camera is a native feature on macOS that lets you use an iPhone as your webcam. For it to work, you can either connect the iPhone to your Mac using a cable, or use it wirelessly if both devices are signed in with the same Apple ID. It’s quite impressive that, despite having to rely so often on video calls for work, I still don’t own a webcam today. Instead, the camera I use at my desk is an old iPhone SE (2nd generation), which was my partner’s main iPhone until they upgraded last year.

Over the past few months, however, video calls have become a critical part of my daily work. As an activist, part of my job now involves conducting online training sessions, sometimes with up to a hundred participants at a time. I can't afford to join one of those sessions without a working camera, so Continuity Camera became a feature I needed to work reliably. Sadly, it doesn't. Half the time, apps like Zoom and Discord on macOS couldn't see the iPhone SE in the list of available cameras, which meant I had to fetch a Lightning cable and connect the iPhone manually. If I was unlucky that day and that didn't work, I'd have to reboot the Mac entirely. If I was really unlucky and even that didn't work, I'd end up joining the call without a camera. Despite my meeting all the requirements listed by Apple Support, the problem kept happening at random.

I had to find a fix for this bug, or at least a way to work around it.



Shot on iPhone: Behind The Scenes of the Scary Fast Apple Event

Jess Weatherbed writing for The Verge:

Behind-the-scenes footage of Apple’s Monday evening Scary Fast event reveals how it was filmed using an iPhone 15 Pro Max… with the aid of a full suite of professional recording equipment and studio lighting. Still images and a video reveal that (unsurprisingly) a great deal of fancy equipment — from drones, gimbals, dollies, industrial set lighting, and other recording accessories — is still required to make iPhone footage look this good.

The equipment supporting the iPhone 15 Pro Max used to film Apple’s Scary Fast event is extensive and clearly assembled for a final product that you couldn’t shoot on your own with just an iPhone. However, it’s still impressive to see such a small device at the center of such an elaborate, well-produced event. The behind-the-scenes video originally leaked in a tweet; here’s the official version:


The Creativity Enabled by the iPhone 15 Pro’s New Cameras and Action Button

Every year, one of the most anticipated iPhone hardware announcements is the changes to its cameras. This year, the iPhone 15 Pro Max’s new telephoto lens was the center of attention. However, there were other notable tweaks to the camera hardware and software across the iPhone lineup, too. Plus, we got a hardware bonus in the form of the iPhone 15 Pro and Pro Max’s Action button, which can perform some interesting tricks. Now, with the new iPhones in the hands of people around the world, we’re starting to see what that hardware can do in practice, and I’ve got three examples I’d like to share.

Source: Lux.


The first is an update to the camera app Halide that does something incredibly clever. Built into version 2.13 of the app is a shortcut that can be tied to the Action button to open Halide with a single press. That’s something you can do with any app using an Open App action, but Halide goes a step further by offering in-app functionality if you tie the button to its app. In the app’s settings, you can choose to tie the Action button to any of the following options:

  • Do nothing
  • Lenses
  • Exposure Mode
  • Focus Mode
  • RAW
  • Resolution
  • Capture

After using the Action button to open the app, pressing the button again will perform whichever action you picked in its settings. For example, if you chose Lenses, the first press of the Action button from outside the app will open Halide, and subsequent presses will cycle through each of the available camera lenses. I love this use of the Action button and hope other developers do the same, adding contextual actions to more apps.

A 5x telephoto shot by Sebastiaan de With.


Speaking of Halide, Sebastiaan de With, one of its creators, published a review of the iPhone 15 Pro Max camera today, concluding that:

With iPhone 15 Pro Max’s default 24 megapixel resolution, added ‘lenses’ under the main camera lens, automatic depth capture for portraits, and that 5× lens, this release might not blow you away on a spec sheet, but is massive for everyone who uses an iPhone to take photos.

There’s a lot of ground to cover between the hardware and processing changes happening behind the scenes. Plus, de With is an excellent photographer whose shots do a fantastic job illustrating what is possible with the iPhone 15 Pro Max. So be sure to check out the full review.

Finally, the iPhone’s camera takes amazing video, too. This year saw the introduction of Log encoding for ProRes 4K footage. That opens up a wider range of editing control in apps like Final Cut Pro, which Joey Helms used to create this amazing video of Chicago:

I’ve had my iPhone 15 Pro Max for just four days, and already, I’m enjoying taking photos as I walk around my neighborhood and playing with features like adding Portrait mode after the fact to images like the one below.

Before (left) and after (right) applying Portrait mode.


The result is a lot more creative freedom that’s more accessible than ever, not only because your iPhone is usually in your pocket but because the tools Apple has created for taking great photos and videos are so easy to use.


How 3D Pets Uses the iPhone to Create Prostheses for Pets

It’s easy to forget how powerful the computers we carry with us everywhere are. While most of us are firing off text messages to our friends, companies like 3D Pets are using the iPhone’s LiDAR and TrueDepth camera in innovative ways to help dogs and other animals with missing or deformed limbs.

Yesterday, both Apple and Marques Brownlee published videos spotlighting the work 3D Pets is doing to create custom prostheses for pets. The process includes taking a 3D scan of the animal using the iPhone’s TrueDepth camera and then modeling and 3D printing a one-of-a-kind prosthesis.

The tech is cool, and the stories are heartwarming and worth taking a break from your day to watch.


Detail Duo and Detail for Mac: A Modern, Machine Learning-Powered Approach to Video

It’s harder than ever to push Apple devices to their limits. Sure, some apps and workflows will do it, but for everyday tasks, Apple silicon has opened a gap between hardware and software that we haven’t seen in a while.

The transformation was gradual with the iPhone and iPad compared to the sudden leap the Mac took with the M1, but the result is the same. There are fewer and fewer apps that push Apple’s chips to the max.

That’s beginning to change with the focus on machine learning and Apple silicon’s Neural Engine. While pundits fret over Apple’s lack of an AI chatbot, developers are building a new class of apps that use local, on-device machine learning to accomplish some pretty amazing feats on all of Apple’s devices.

Detail Duo.


Great examples of this are the apps by Detail, an Amsterdam-based startup. Detail has two apps: Detail Duo, an iPhone and iPad video production app, and Detail for Mac, which does something similar but with a focus on multi-camera setups more suitable to a desktop environment.

As I explained in my Final Cut Pro for iPad first impressions story last week, I don’t work with much video. However, I’ve been dabbling in video more, and I’ve discovered a story as old as personal computers themselves.

Every hardware advance that creates a huge amount of performance headroom is eventually consumed by the ever-growing demands of apps. That’s just as true with Apple silicon as it was for other chip advances. What seemed like more power than average consumers would ever need quickly becomes a necessity as apps like Detail Duo and Detail push that hardware to its limits.

It’s these sorts of advances that I find incredibly exciting because when they’re coupled with intuitive, well-designed apps, they open up entirely new opportunities for users. For Detail, that means simplifying and democratizing video production that would have been out of reach of most users not that long ago, expanding access to video as a creative outlet.

Before digging into these apps further, though, you should know that my son Finn is on the team building Detail and Detail Duo. That’s one of the reasons I’ve known about and followed these apps for a long time now. I figured that’s context readers should know.



MKBHD on Apple’s Processing Techniques for iPhone Photos

In his latest video, MKBHD eloquently summarized and explained something that I’ve personally felt for the past few years: pictures taken on modern iPhones often look sort of washed out and samey, like much of the contrast and highlights from real life were lost somewhere along the way during HDR processing, Deep Fusion, or whatever Apple is calling its photography engine these days. From the video (which I’m embedding below), here’s the part where Marques notes how the iPhone completely ignored a light source that was pointing at one side of his face:

Look at how they completely removed the shadow from half of my face. I am clearly being lit from a source that’s to the side of me, and that’s part of reality. But in the iPhone’s reality you cannot tell, at least from my face, where the light is coming from. Every once in a while you get weird stuff like this, and it all comes back to the fact that it’s software making choices.

That’s precisely the issue here. The iPhone’s camera hardware is outstanding, but how iOS interprets and remixes the data it gets fed from the camera often leads to results that I find…boring and uninspired unless I manually touch them up with edits and effects. I like how Brendon Bigley put it:

Over time though, it’s become more and more evident that the software side of iOS has been mangling what should be great images taken with a great sensor and superbly crafted lenses. To be clear: The RAW files produced by this system in apps like Halide are stunning. But there’s something lost in translation when it comes to the stock Camera app and the ways in which it handles images from everyday use.

Don’t miss the comparison shots between the Pixel 7 Pro and iPhone 14 Pro in MKBHD’s video. As an experiment for the next few weeks, I’m going to try what Brendon suggested and use the Rich Contrast photographic style on my iPhone 14 Pro Max.


Belkin’s MagSafe Mount for Desktops and Displays, Hand Mirror, and the Logitech Crayon

It’s the end of the year, and before I take a few days off to relax for the holidays, I have a few cool things to share that have been sitting on my desk and Mac for a little bit.

The Belkin Mount with MagSafe for Mac Desktops and Displays

One of macOS Ventura’s flagship features is Continuity Camera, which lets you use an iPhone’s camera as a webcam. I covered Continuity Camera in my Ventura review, and it works really well, especially with Center Stage turned off, so you get the full uncropped image from the iPhone’s camera.

A side view.


Alongside Continuity Camera, Belkin introduced an excellent, compact MagSafe mount for Apple laptops but left desktop and external display users hanging. Last week, desktop users got their wish with a double-hinged MagSafe mount that I expect will work with a wide range of displays.

Ready for hooking to a screen.


Belkin sent me its new mount last week, and I immediately gave it a try. The hardware has a nice, solid feel. The hinges are stiff, so your iPhone’s weight won’t affect your setup, and every surface that touches your display, front and back, as well as your iPhone, has a soft-touch finish that shouldn’t scratch your display or phone.



Austin Mann Reviews the iPhone 14 Pro’s Cameras

Source: Austin Mann


Austin Mann’s review of the iPhone 14 Pro’s cameras is out, and as usual, he’s back with beautiful photos from an interesting location. This time, it’s the Scottish Highlands, where Mann put the iPhone 14 Pro’s new cameras to the test.

One of the advantages of the new 48MP camera is more latitude to crop images without reducing their resolution too far. There’s a great example in Mann’s review of a tight crop on a rooster that illustrates how far an image can be cropped and still retain lots of detail. Still, Mann concludes that he’s more likely to shoot at 12MP than 48MP in most situations because it’s still the fastest way to shoot and performs so well in low light. Mann was also impressed with shooting video in Action Mode, although he notes that it requires good lighting and crops the resulting video substantially.

Mann’s bottom line:

With high-resolution imaging capability, Action mode stabilization, and a Cinematic mode that now supports 4K at 24 fps, the iPhone 14 Pro is a powerful imaging tool in the pocket of a creative pro. Beyond the cameras, new safety features like Emergency SOS via satellite and crash detection are exciting to have with me (and with my loved ones).

Now I’m just hoping we see some monster steps forward in the digital workflow so we can quickly get these beautiful files off our cameras and into our projects to share with the world!

For examples of the kind of shots that are possible when the iPhone 14 Pro is in the hands of a professional and more details on the camera’s performance, be sure to visit Mann’s site.


Halide 2.5 Adds New Macro Mode

Halide 2.5 is out, and it includes a brand new Macro Mode. Macro photography is an exclusive feature of the iPhone 13 Pro and 13 Pro Max. Still, Halide has managed to make its Macro Mode available on the iPhone 8 and newer models thanks to some cool machine learning tricks.

Switching to Macro Mode and dialing in precise focus is simple with Halide 2.5.


Macro Mode is easy to use. When you open the app, auto-focus (AF) is selected by default. Tap it, and the focus controls slide into place, with auto-focus at one end of the app’s focus dial and Macro Mode (the button with the flower) at the other. Select Macro Mode, and a new focus dial with smaller increments appears. The Halide team says this enables sub-millimeter adjustments for extra-precise close-up focusing.

Halide takes its close-ups by first switching to the camera on your iPhone that can take the closest shots. Focusing is handled by its precision focus dial, and the final step is to enhance the image’s details using an AI-based enhancement process. That last super-resolution step is what allows Halide’s Macro Mode to be used on cameras on older models of iPhones and to enhance Apple’s own macro system too.

In my testing over the past day, the results have been impressive. I’m especially fond of the precise focus dial that allows for minute adjustments that make a difference at such close range.

If you’re a Club MacStories+ or Club Premier member, head over to the new Photography channel in our Club Discord to see even more of my experiments with Halide’s Macro Mode and share your own macro shots.

Halide is available on the App Store as a subscription for $2.99/month or $11.99/year, or for a one-time payment of $49.99. The app also offers a 7-day free trial.