Monthly Archives: May 2016

How I use Siri

According to Caitlin McGarry, a writer over at Macworld, Siri is seriously behind:

Google previewed its new voice assistant, the generically named Google Assistant, at its I/O developers conference on Wednesday, and while the assistant hasn’t actually launched yet, its features made Siri’s lack of functionality all the more obvious. Apple’s iOS assistant now lags so far behind not only Google, but Amazon’s Alexa, intelligent iOS apps like Hound, and new technology from Siri’s creators, that it’s unclear if Apple has any interest in catching up.

Siri usually reverts to a web search for questions it doesn’t know the answers to, and now I’ve come to expect that if I want anything more than a timer set or the weather forecast, I’ll have to look it up myself.

That’s silly. Linkbait. And here I am linking to it. I’ve not seen what Google previewed in a currently unavailable application, but to jump from that to the conclusion that Siri is suddenly so far behind that Apple’s interest in keeping it up to date is now in question? Seriously? Where do I wash this muck off?

And the suggestion that Siri is mostly just good for setting timers or getting the forecast only tells me that either Caitlin does not know how to use Siri or she’s being intentionally dishonest.

I’ll agree that Siri could be better. Of course it could. That can be said about any of the other virtual assistant services currently available. No doubt Amazon has done a fantastic job with Alexa and has raised the bar. But to paint Siri as this useless, abandoned technology is just ridiculous.

As someone who actually uses Siri on a regular basis, let me share just a small sampling of my requests. Yes, timers and alarms and the weather. And really, those are quite useful. But as an avid amateur astronomer I also routinely seek out quick facts. When I’m at the telescope I’m using my favorite astronomy app, Sky Safari, which has much of what I need. But there are times when I’m out for a walk and thinking about some aspect of astronomy. I have on occasion put together Keynote presentations for the local library and might be mulling over a few topics or possible content. For example: What is the largest moon of Jupiter? Siri comes back with an answer via Wolfram Alpha (Ganymede). Another: What is the distance to Pluto? Yup, again via Wolfram Alpha (32.41 astronomical units, in case you’re curious). It gets better and better. Siri, when was the Mars Curiosity Rover launched into space? Yup, another answer from Wolfram Alpha:

img_5171.jpg

If I need the time that a planet is going to rise above the horizon because I’ve got friends coming over to spend some time at the telescope? Yup. I can get the time via Wolfram Alpha right in Siri’s search results. Now, to be sure, some questions do bring up web searches, usually with Wikipedia right at the top, which is almost always what I want.

But I’m interested in more than astronomy. I have a keen interest in local flora and fauna. That flower over there that I think might be a gray-headed coneflower? Or that bird that might be an indigo bunting? I can ask Siri for images and have them presented right in the results.

Podcasts or music while I’m out for a walk? I can ask for a specific episode of a specific podcast, or the most recent one, and it opens up nicely. I can have specific music played or a shuffle of everything.

What about local errands or day trips? If I’m interested in directions or the open hours of nearby businesses that usually works out. I’m in a small town in rural Missouri and with Yelp as the basis for these searches it can be hit and miss. If I ask “where is the nearest Trader Joe’s and is it open today?” I get the results with a map and Siri lets me know the hours for today. I can tap the map for full directions. I’m not likely to drive that far but it’s possible I might drive into a closer town for Panera Bread. Again, I’m presented with the hours and a map, which also indicates that they accept Apple Pay. If I ask about movies for that town I get a list of movies with nice movie art as well as times. If I tap one I get more details, directions and an option to play the trailer. Maybe I decide to stay in and rent something on iTunes. A request brings up the information, artwork and option to rent or purchase. Maybe get some pizza? The phone number, map and hours all pop up.

I can do this all day long. Sports? I’m not into sports but I’ve played with it a bit just to see what Siri can produce. Excellent results. I grew up following baseball and that’s something I still follow a bit. If I’m curious about my favorite Cardinals player, Yadier Molina, I can ask how he’s doing this year and I get a nice report:

img_5172.jpg

Recipes? Easy enough to find via a Siri web search. Of course, that goes for anything at all, and I realize that, while handy, initiating a web search is likely pretty low on the list of any voice assistant’s skills. I’m not going to suggest that Siri is perfect. If I ask her to make me lunch I get a simple no. If I ask her to water my garden, no again. But you see, those are obviously absurd requests of a virtual assistant. Actually, now that I think about it, there might come a time when Siri could water my garden, if and when HomeKit and the necessary components are made available, but I don’t think that’s happened yet.

The takeaway here is that there is plenty that Siri can do and I suspect that Caitlin knows it. As a writer for Macworld I certainly hope she does. Anyone can activate Siri and ask “What can you do for me?” and get a very helpful list that goes far beyond my examples. Sure, we all hope for more and no doubt more is coming. But for now I’ll happily go on using Siri in the ways that I know I can. I’ll even continue learning new ways to use “her” as I try new requests, some of which will fail but many of which will helpfully provide information or complete a task I need help with. For that I’ll be grateful to Apple and its engineers.

Managing Websites with an iPad

Updating html with Editorial

One of the tasks I do fairly often that I expected would remain Mac-based is website management, specifically the updating of page content for clients. I’d tried it several times using several different apps starting with my first iPad. But it never seemed to work out. Yes, I could make it work but ultimately it was too many hoops, too much friction. I liked the idea of being able to make an emergency update should I ever be away from my Mac and only have the iPad with me. But that rarely happened. I’m not sure it ever happened.

So, every year or two I’ve made it a point to revisit this particular task and possible use for the iPad. When Panic released the first iteration of Coda for iOS a couple of years ago, “Diet Coda,” I was excited, but it didn’t quite do the trick. Closer but still not there. In recent months, as I’ve been leaning more on the iPad in daily use, I figured it was time to revisit my search for a sensible workflow for the task. I’m a regular reader of Mac Stories and have always found Federico’s use of the iPad interesting. So I did a quick search there to see if this was something he had addressed. He has, but not in a way I found helpful. He manages his one website and does it exclusively from his iPad. I need to manage many sites from at least two devices, which means synced local files.

It took a while but I’ve finally settled into something that works very well on the iPad. On the Mac most of my web work takes place in Panic’s excellent Coda app. I’d hoped to use the newest version of Coda for iOS, and I have given it a fair shake, but it’s not my tool of choice. Why? Well, I keep my source html files on Dropbox so that I can access them from any device for local editing. Coda for iOS does not offer an option for sourcing files from Dropbox. There is a pane for “local” files and server files. Bummer.

But the solution, for now, is another Panic app: Transmit for iOS. The app has a built-in text editor which is far from powerful but offers the basic text editing I need to get the job done. It would be great if it included find and replace, but that’s not a deal breaker. The big plus is that once I’ve edited a file on the server I can easily use the “Send to” function to save the file to Dropbox. Another function not in Coda.

If I prefer to start with my edits on the “local” side I can begin with Editorial, which does allow for opening and saving to Dropbox. I can edit the “local” Dropbox copy and copy/paste the updated text into the file via Transmit. A little clumsy, but it works pretty well. Unfortunately Editorial has not been updated to take advantage of iOS 9’s split screen mode. Not a huge deal, as it’s fairly easy to just Command+Tab between Editorial and Transmit. Word is that there is a beta and a new version should be coming before too long. At that point I’ll split screen with Transmit and open and edit my “local” Dropbox copy in Editorial, which does have find and replace as well as syntax highlighting. An additional benefit with this is that Editorial autosaves the file to Dropbox.

The two biggest downsides to the iPad-based workflow: first, there is no site-wide search and replace. For that I’ll have to log in via Coda on the Mac, but that’s not something I do all that often. The second is that these workflows technically handle only one html file at a time. In Editorial I can have all the site html files in the sidebar during editing. It’s easy enough to jump from one to another as needed, like tabs. If there is a hang-up it would be clicking back and forth in Transmit to open a file on the server, then paste, then save, then close. But my usual updates are just two or three files per session, so it’s not difficult to manage, just not quite as easy as Coda on the Mac.
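(If I ever did need a quick site-wide replace away from the Mac, it’s the kind of thing that could probably be scripted. Here’s a minimal sketch in Python, the sort of thing an iOS Python environment like Pythonista or Editorial’s console could run against the local Dropbox copy of a site. The folder path and the strings being swapped are placeholders, not anything from my actual sites.)

```python
import fnmatch
import os

# Placeholders: the local Dropbox copy of one client site
site_dir = os.path.expanduser("~/Dropbox/Sites/example-client")
old = "info@old-address.com"
new = "info@new-address.com"

changed = 0
for root, dirs, files in os.walk(site_dir):
    for name in fnmatch.filter(files, "*.html"):
        path = os.path.join(root, name)
        with open(path) as f:
            text = f.read()
        if old in text:
            with open(path, "w") as f:
                f.write(text.replace(old, new))
            changed += 1
            print("updated " + path)

print("%d file(s) changed" % changed)
```

The edited local copies would still need to go up to the server via Transmit afterward, but the tedious part, touching every page, would be handled in one pass.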

An alternative would be Textastic, which is an excellent text editor on the iPad, has been updated to use split screen, and has built-in ftp. However, like Coda and Transmit, it does not allow for using Dropbox as a “local” file store.

For images I’m using Pixelmator combined with a workflow built in the Workflow app for changing the size, format and quality as needed. Just getting started with that and will see if it meets my expectations. So far it seems to.
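(For the curious, the resize-and-recompress step is simple enough that it could also be done with a few lines of Python and the Pillow imaging library. This is just an illustrative sketch of that kind of step, not the workflow I actually built, and the filenames, width and quality below are placeholders.)

```python
from PIL import Image  # Pillow

# Placeholders: source image, maximum width and JPEG quality for the web
src = "header-photo.png"
max_width = 1200
quality = 80

img = Image.open(src)
w, h = img.size

# Cap the width while preserving the aspect ratio
if w > max_width:
    ratio = max_width / float(w)
    img = img.resize((max_width, int(h * ratio)), Image.LANCZOS)

# Convert to RGB (dropping any alpha channel) and save as a compressed JPEG
img.convert("RGB").save("header-photo.jpg", "JPEG", quality=quality)
```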

Getting Caught Up

Nice View

The view from one of my many work areas.

Okay. Going to geek out for a moment. Been far too long since my last post. I’m not always the most consistent of bloggers. What can I say. Life happens. I’d like to say I’ll get more consistent but I can’t be certain. Aside from that, there are plenty of other excellent sites out there covering what I would likely cover in the way of Apple news. Well, regardless, I will continue to share occasional posts about recent projects and reflections.

So, yes, my last post was seven months ago and was just a bit of commentary on Apple’s releases of OS X 10.11 El Capitan, iOS 9 and the iPhone 6S. At the time I was expecting to do a series of projects which fell through. The 2014 rMBP that I purchased with those projects in mind went largely unused for the past six months. I did get some use out of it, but not enough to keep it. I only purchased it because circumstances at the time seemed to justify it. So I’ll likely be selling it soon. I’ll continue on with the Mac Mini at my standing desk for much of my design work. The iPad Air 2 and iPhone 6 will continue to serve for anything else. Which brings me to mobile computing with iOS.

My first iOS device was the original iPad, which I purchased as soon as it was released in 2010. I used it quite a lot, probably a 50/50 split with my previous laptop, a MacBook Air. My usage remained about the same when I upgraded to the iPad 3, the first with a Retina screen, and the pattern held with the iPad Air 2. Apple nerds have spent far too much time discussing whether the iPad is something that can be used for “real work” and, along the same lines, whether the iPad should be considered a real “computer”. The answers are obvious. Yes and yes.

The first is, in part, context-dependent. The iPad is great for some tasks, not others. But this might also be said of a hammer or a bicycle or a boat or any other object. I don’t blend with a toaster and I don’t plant a tree with a screwdriver. iOS devices, be they iPads or iPhones, are suited to particular tasks just as Macs are suited to others. I suppose all the discussion stems from the gradient of usage, the overlaps that are possible with the different platforms and form factors. The introduction of the large iPad Pro only made that more interesting.

What I’ve discovered in recent months is that my three primary computers all serve to complement each other perfectly. It’s that simple. My Mac Mini is used for projects that require InDesign as well as website management that requires site-wide search and replace, which I do with Coda. It also serves as my iTunes/Plex server. Oh, and accounting via iBooks and occasional FileMaker work. My iPhone is for tracking my diet and steps, reading books, checking email, messaging, and a bit of web browsing. Oh, and the rare phone call. My iPad is for browsing the web via RSS or browser, reading books, messaging, phone calls, typing podcast transcripts, writing anything of length, and managing websites.

Some tasks/activities are best handled by two of these together. For example, astronomy sessions are a mix of iPhone and iPad. I use the iPad for recording data into FileMaker and some searching with Sky Safari Pro. I use the iPhone for much of the searching with Sky Safari Pro because it’s small and can be easily attached to the telescope. Much of my graphic design begins with Pages or Graphic (formerly called iDraw) or Pixelmator. In some cases I can complete the task entirely on the iPad; in other situations I transfer to the Mac to finish. An example would be the logo for Beardy Guy Creative. I did most of that using Graphic on the iPad, then exported and finished with Illustrator because I’ve got many more fonts installed on the Mac.

At this point my preferred form factor is probably the iPad. It’s the device I choose to use most often as it strikes a nice balance of portability, flexibility and power. With iOS 9 and multi-tasking via split screens, coupled with the extensions introduced with iOS 8, I find that the iPad is often up to the tasks I ask of it. That I can use it in a stand with an external keyboard or as a tablet is fantastic. The screen-as-computer form makes the iPad the easiest to rearrange into a delightful variety of working arrangements. Of course the simplest is holding it in my hands on a couch or chair, but when used with a keyboard there is great benefit to being able to put it up high, off to the side, or in whatever position I need to be comfortable. Or at a desk, table or shelf in a stand, of course, if I want or need such an arrangement.

One last point concerns the importance of adopting the new features of iOS, particularly split screen and extensions.

I’m not sure why, but I initially didn’t use split screen. I tried it a few times and figured it would be handy, but I never made it a habit. In recent months, as I began looking for a frictionless podcast transcription workflow, I went from an iPhone/MBP/Mac Mini set-up to an all-iPad set-up that relied on having Pages and the Apple Podcasts app each in split view, and that sealed the deal. After using that for the past six weeks, NOW it clicks. Split screen is, without a doubt, the best part of the iOS 9 update for the iPad. Now that it’s become the basis of my transcript process it is finding its way into daily use for all sorts of tasks.

Extensions are something I’d dabbled in and put to some use. The benefit is obvious from the first use: they add a lot of flexibility to the iOS workflow. And yet there’s just a bit of complexity, and I suspect I’m not the only one who has taken some time to really work extensions into daily workflows. There’s a depth to the flexibility that is not initially obvious. The more I use them, the more natural they seem, and it contributes to a sense that the friction of iOS is slowly falling away.