Posts tagged with "iOS"

Emoji can be hard to find from the iOS system keyboard. Although they have official names, emoji aren't accessible by those names from Apple's keyboard. Instead, if an emoji isn't in your frequently used, you are left with the task of remembering or guessing which category it falls into to find it. The trouble is, the groupings aren't that intuitive. Here's a test: Which category is sunglasses in? Objects or People?1

Emojinn is a useful little utility that makes it easier to find the emoji you want without memorizing where they are.
Derek Reiff, writing on The Omni Blog:
Recently customers have been wanting to take advantage of automation apps like Workflow, Drafts, Pythonista, and others to quickly add new actions or projects or switch to different views inside OmniFocus.
With 2.14, OmniFocus now includes best-of-class support for callback URLs. At its simplest, this means that you can create a workflow that adds more than one item to OmniFocus. But we didn’t just add support for two-way communication between OmniFocus and other apps, we added support for automating a whole lot more of the powerful capabilities of OmniFocus.
Ken goes into the nitty gritty in a detailed Discourse post. But aside from doing the usual name and note additions, you can add estimates, attachments, dates, repetition rules, flags, and even set a project to Parallel.
While I won't be switching away from 2Do, I've tried the beta of OmniFocus 2.14 and the new automation features on iOS are impressive. I think a lot of people are going to reconsider OmniFocus and take advantage of callbacks for app integrations.
The examples posted by Ken Case on the forums should give you an idea of the improved capabilities. It's no surprise that The Omni Group continues to redefine what making pro apps on iOS means.
It's also great to see TaskPaper being used as a structured text communication format – I'd like to see more developers follow this route.
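To make the TaskPaper-over-URL idea concrete, here is a small sketch of how an automation app might hand OmniFocus a batch of tasks in one call. The TaskPaper syntax (projects ending in ":", tasks starting with "- ", metadata as @tags) is the documented format; the `/paste` action and its `target` and `content` parameter names are assumptions based on my recollection of Ken Case's forum examples, so check the Discourse post for the exact scheme.

```python
from urllib.parse import quote

# A hypothetical TaskPaper snippet: one parallel project with two tasks.
# Projects end with ":", tasks start with "- ", metadata is @tags.
taskpaper = (
    "Trip Planning: @parallel(true)\n"
    "\t- Book flights @due(2016-06-10) @flagged\n"
    "\t- Reserve hotel @estimate(30m)\n"
)

# Assumed URL shape (see Ken Case's forum post for the real one):
# the whole TaskPaper block travels as a single percent-encoded parameter.
url = "omnifocus:///paste?target=inbox&content=" + quote(taskpaper)
print(url)
```

The point is that one URL carries an arbitrary amount of structure — nested projects, dates, flags, repetition rules — which is exactly what single-field `/add`-style URLs couldn't express.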
Gifstory makes capturing your own GIFs easy by imposing constraints. One of the difficulties with GIFs is that there are lots of variables that impact how big they are and how good they look. I like GIF Brewery on the Mac, but it is easy to get caught up in tweaking those variables endlessly, trying to get a GIF that looks perfect. Gifstory, which is iPhone-only, eliminates the fiddling by imposing limits that work. Point your camera at something, press and hold the capture button, and you can capture a 320 × 426 or 320 × 320 GIF up to sixty frames long.
Agile Tortoise has teamed up with David Sparks of MacSparky.com to produce a series of screencasts introducing Drafts 4 to new users and highlighting some of its features. The first two screencasts were released today. The first video is an overview of how Drafts works, and some of the things you can accomplish with it. The second video focuses on using Drafts with Dropbox to save text as a separate file in Dropbox or append text to an existing text file.
David Sparks, who has made screencasts for companies like The Omni Group and Smile Software, does a great job of showing how easy it is to get started with Drafts, but also exposing some of the powerful ways Drafts can interact with other apps like Dropbox.
You can watch the first two Drafts videos below.
Microsoft is on a roll with iOS keyboards. Just over two weeks ago, Microsoft’s Garage project released Hub, which has tight integrations with Office 365. Today, Microsoft Garage released another keyboard called Word Flow, which is based on the Windows Phone keyboard from Windows Phone 8.1. According to The Verge, Microsoft had promised a public beta period for Word Flow, but skipped that step.
There are multiple options for interacting with Word Flow. Once installed, you can use it in one-handed mode where the keyboard can be displayed in an arc on the left or right side of the keyboard area. Alternatively, you can use a traditional keyboard layout.
In either case, Word Flow accepts both swiping and tapping input and has a strong predictive algorithm that anticipates the words you intend to type. In my limited tests, the one-handed mode was helpful for typing on my iPhone 6s Plus one-handed, and the predictive input was solid, though it didn’t seem to recognize its own name, failing to capitalize ‘flow’ in the screenshots above. Of course, for the predictive feature to work, you need to grant Word Flow full access to your keyboard input, which not everyone will be comfortable doing.
Word Flow also lets you set backgrounds behind your keyboard, including ones that are built into the app or your own photos. Personally, I think the backgrounds are ugly and distracting, but fortunately you can use Word Flow without a background.
Microsoft Word Flow is available on the App Store for free.
Update: Microsoft Word Flow is a US English keyboard and is only available in the US App Store.
When Apple Music debuted last summer I switched to it from Spotify. I wasn’t on Spotify all that long, but I did have a few playlists I wanted to take with me, including a big one with all the songs I had favorited. At the time, I found a script that logged into both services, tried to match the songs, and replicated the playlists on Apple Music. It worked reasonably well, but not great. SongShift automates that process. In my tests, SongShift did a solid job matching songs between Spotify and Apple Music, but because it is an import utility and not a sync service, it is a little cumbersome as a way to keep up with frequently updated playlists you follow on Spotify.
I almost can't believe there was a time when the iPad didn't have Split View. Or that, for years, I thought Apple would never bring split-screen multitasking to iOS.
More than any other iOS update before, iOS 9 has fundamentally reinvented the role of the iPad in my computing life. As I've written in my iOS 9 review and look at the iPad one year after my decision to stop using my Mac, iOS 9 and its multitasking improvements have catapulted the iPad away from the old limitations of iOS into a different league.
With iOS 9 and the iPad Pro, the list of tasks I can't perform on my iPad is down to a single item.
I've never been happier with an iOS device and operating system than I am with the iPad Pro and iOS 9. Getting my work done is faster; using apps in Split View is a treat compared to iOS 8; and the software keyboard (despite some problems) continues to impress me with the addition of trackpad mode and the Shortcut Bar. The iPhone 6s, too, has been a pleasant surprise thanks to its superior performance and 3D Touch.1
There's never been a better time to be an iOS user. But that doesn't mean that everything's perfect. When it comes to iOS, happiness is often a fleeting moment – a temporary satisfaction with the current state of things before the inevitable longing for something deeper. Such is the constant pursuit of the future.
Every year, I put together a list of the changes I'd like to see in the next version of iOS. I've been doing this for several years now. This year, I wanted to prepare something bigger. The tenth version of iOS, due to be released later this year, will be a major milestone for Apple and iOS users. It only felt appropriate to celebrate the occasion with a different take on my annual iOS wish list.
For the past few months, I've been collaborating with Sam Beckett (author of a fantastic Control Center concept we linked to a while back) to visualize my iOS 10 wishes with a professional concept video and static mockups. Sam and I discussed my ideas for a couple of months, and he was able to visualize2 what I would like to have in iOS 10 – both for the iPhone and iPad – with a style and attention to detail I'm extremely happy with.
Below, you'll find a collection of my iOS 10 wishes, organized into tentpole features (the ones also shown in the video) plus additional sub-sections. Some of these wishes have been on my list for years; others are a consequence of the features Apple shipped with iOS 9.
Let's get started.
- I use 3D Touch heavily every day. Peek previews for tweets and links in Tweetbot, Home screen actions, keyboard cursor control, and channel shortcuts in Slack are gestures I intuitively perform daily. ↩︎
- He was also very patient. As you can imagine, I sent him a lot of ideas and comments before signing off on the final version. I can't recommend working with Sam enough. ↩︎
Stanford University has published an iOS programming course (CS193P) on iTunes U annually since the very first iPhone SDK was released. Back then, the course was taught by Apple engineers, which was unheard of at the time, and perhaps a sign of greater openness to come. There are still a lot of good basic lessons on Objective-C, model-view-controller patterns, and other fundamentals in that first lecture series, even though Cocoa Touch APIs have changed substantially over the years. But perhaps my favorite lecture from that first class is a short talk Loren Brichter gave on the development of Tweetie, his Twitter client that was the first app to feature pull-to-refresh and was eventually purchased by Twitter.
Now, seven years later, Stanford has begun posting lectures for Developing iOS 9 Apps with Swift. It helps to have some basic object-oriented programming experience before taking on this class, but don't let that discourage you. When I started teaching myself programming I watched that first lecture series over and over, stopping to research things I didn't understand as I went. And even if you're not interested in learning to program for iOS, go back and watch Loren Brichter's talk; it's a fascinating time capsule of how far iOS has come and the clever tricks programmers used in 2009 to get around the technical limitations of early iPhones.
Last week, Story & Pixel announced that it is pushing to release the upcoming documentary App: The Human Story in early June, just before WWDC. I remember being at The Talk Show live recording at WWDC in 2014 when Jake Schumacher and Jedidiah Hurt, the team behind Story & Pixel, captured some of their first footage for the film. That was followed by a successful Kickstarter later that summer, and a whole lot of work since then. This past Fall, Story & Pixel debuted fifteen minutes of opening footage for App at XOXO Fest in Portland.
During production of App, the Story & Pixel team needed a discreet way to timestamp footage while they were filming as a sort of bookmarking system that would allow them to quickly jump to certain footage during the editing process. They tried things like taking notes, but found that it disrupted interviews. To solve the problem, Story & Pixel, working with developer Ryan Newsome, created an iPhone app with four large buttons that can be discreetly tapped while filming to create up to four different types of bookmarks. The bookmarks can be exported into Adobe Premiere Pro (subscription required) when you're ready to edit your footage.1
The app, called Quantify, launched last week and is free with a choose-your-own-price subscription model. Subscribing adds the ability to export to Adobe Premiere Pro and includes premium support.
App has been in production for almost two years now, and it's great to see that Story & Pixel are in the home stretch. Judging from the footage that has been previewed so far, I think we are in for a real treat.
- One thing that immediately occurred to me when playing around with Quantify is that something like this would be handy for podcasters who want to mark portions of audio that need editing, contain information that should be added to show notes, or where a chapter marker could be added. ↩︎