Trying Drafts Again

(Note: I started this post back in February but never published. It’s now April and Drafts 5 has just recently been released!)

First, I'll say that I make it a point not to clutter up my devices with apps I don't use. There's a balance to be found between sticking with apps that work and remaining open to discovering new ones. Early on with the iPad, then the iPhone, I tried lots of apps that I didn't use for long, and then I gradually settled on a fairly small subset that fills most of my needs. But being a nerd, there is a constant pushback. Reading and listening to Federico Viticci makes this even more difficult. He's constantly experimenting. So much so that I don't know how he gets any sleep given what he produces. He had a recent write-up on the new automation in Things 3.4, which I sorta use, and in it he also touched on the upcoming Drafts 5.

Drafts. This is one of those apps I bought but never use. It fits into the territory of Notes and Evernote. A few years back I tried Evernote for a few months but gave up on it. Why? Notes. I've always gravitated back to Apple Notes, and it's even better now with scanning, searchable PDFs, images, etc. That said, Drafts has a few advantages for just working with text, particularly Markdown. But then it crosses over into territory also occupied by iA Writer, which is what I use for blogging and podcast transcription. So it becomes a question of whether there is really a place for it or whether it's just clutter.

The key to determining whether it will be useful for me is the actions it's capable of. The primary purpose of Drafts is to be a place for quickly capturing text, which can then be built on or sent along to another app. The quick capture part is certainly true. When I tried it in the past I looked at the automations and thought yes, they would be useful, but most seemed to be basic feeds to other apps. Which is the point, and which also had me questioning the usefulness. Why not just start and finish the email in Mail? Start and finish the post in Micro.blog? The tweet in Twitterrific. The event in Fantastical. The to-do in Things. Again, that is the whole point of Drafts. It is for people who want to go to one place to start every action. Hence the name. You start a draft which you then send on to its final destination. Finally, the light goes on in my head. It took too long and it's pretty dim. But there it is.

Okay, okay. So, now that I finally get this simple point and purpose, will I fit it in? I admit I'm curious. Over the past couple of days I've made it a point to try. I added a couple of items to Things via Drafts. I added a couple of items to my calendar via the Drafts-to-Fantastical action. I even created a blog post which I saved to my iA Writer directory in iCloud, then hopped over to iA Writer, opened it, and posted to WordPress. I'm going to make an effort for the next week or so to start with Drafts. That should be enough time to make it a part of the routine, get to know the app and what it's like to use, and see whether it reduces friction or increases it. Some of these actions are the sort of thing I increasingly accomplish with Siri. When I added the events to my calendar using Drafts I had to deliberately stop myself from using Siri. Fine for the purposes of evaluation, but day-to-day I'd likely just use Siri.
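
Many of these send-to-app actions are, under the hood, URL-scheme handoffs. As a rough illustration, not Drafts' actual implementation (actions are configured inside Drafts itself, often with JavaScript steps), here's a minimal Swift sketch of the kind of URL a "send to Things" action builds, using the title and notes parameters from Things' documented add command:

```swift
import UIKit

// Sketch of the URL a "send to Things" action hands off. The function
// itself is just for illustration; title and notes are documented
// parameters of Things 3.4's add command.
func sendToThings(title: String, notes: String) {
    var components = URLComponents()
    components.scheme = "things"
    components.host = ""          // produces things:///add
    components.path = "/add"
    components.queryItems = [
        URLQueryItem(name: "title", value: title),
        URLQueryItem(name: "notes", value: notes)
    ]
    guard let url = components.url else { return }
    // Hands the draft off to Things, which creates the to-do.
    UIApplication.shared.open(url)
}
```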

April 19th update. Well, I wrote the above but never published it. Did I use Drafts much in the 50 days since writing it? Some, but not much. The final version was just released yesterday, so of course it's all over my RSS and Twitter as the nerds go nuts for the latest text app. I spent some time reading the review by Tim Nahumck over at MacStories. I've read a few other things and watched a couple of videos. I decided that to give it a fair shake I needed to move it into the Dock, where it now sits next to iA Writer, the app it would potentially replace.

Comparing iA Writer and Drafts

Both apps have very pleasant writing environments. Both have features the other does not, so there will be trade-offs, as is always the case when choosing between apps of any kind.

I use iA Writer to write and publish to my two WordPress blogs and for podcast transcripts, which get exported to PDF and HTML. It works very well for those tasks. How does Drafts do? With Drafts I can print to PDF and export to HTML (both are actions downloaded from the action directory). There is no built-in blog publishing other than sharing via the share sheet to the WordPress app, which is very limited. That said, there is an action to copy as rich text. From there it is a simple step to switch to Safari or the WordPress app, create a post and paste. When I use iA Writer I'm taken to Safari anyway for a final check before posting. So, either way, it's essentially the same.

Document storage is another consideration. Here I give the edge to iA Writer, which autosaves and stores all of its files as plain text files in iCloud, a huge plus. Also, iA Writer documents can be organized into folders within the app, and those folders also exist in the Files app on iCloud. By comparison, Drafts keeps its documents in its own synced database and does not offer folders other than the default four: Inbox, Flagged, Archive and Trash. Organization in Drafts can be accomplished via tags, though, and that's potentially even more powerful than folder-based organization. If I need to, I can save my documents as txt files to any location, an added step compared to the native text files used by iA Writer, but a pretty simple one.

The greatest benefit to using Drafts would be the more customizable interface and the extensibility of actions. The whole point of the app (originally) was to be a place to start text so that it could then be used in a variety of ways via sharing. I'll add that getting text into Drafts is much easier via other apps' share sheets. I often want to share text from Safari for a blog post. With Drafts I can select text and the share sheet gives me that selection as well as the Markdown link at the top. Very handy. With iA Writer this is not possible. I find it hard to believe that the developers of iA Writer have not enabled receiving text from other apps via share sheets! I can copy/paste or drag and drop, but it's extra work. A big plus for Drafts on this.
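
As an aside on why this is app-dependent: receiving shared text means the app ships a share or action extension that accepts text and reads it off the incoming items. A hypothetical sketch of the receiving side in Swift (the controller and its behavior are my illustration, not Drafts' actual code):

```swift
import UIKit
import MobileCoreServices

// Hypothetical extension view controller showing how an app like Drafts
// can receive selected text from another app's share sheet. An app that
// ships no such extension never appears as a share target for text.
class CaptureViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let items = extensionContext?.inputItems as? [NSExtensionItem] ?? []
        for item in items {
            let providers = (item.attachments as? [NSItemProvider]) ?? []
            for provider in providers {
                guard provider.hasItemConformingToTypeIdentifier(kUTTypePlainText as String) else { continue }
                provider.loadItem(forTypeIdentifier: kUTTypePlainText as String, options: nil) { text, _ in
                    if let text = text as? String {
                        // In a real app: create a new draft from the text.
                        print("Captured: \(text)")
                    }
                }
            }
        }
    }
}
```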

Of course, there is far more to both of these apps; I'm only touching on the most obvious features as they relate to my typical usage. I really like the feel of both of them. Very pleasant to write in and easy to use. They both stay out of the way but provide enough interface to make formatting Markdown easy.

Subscriptions Done Right

Pricing on Drafts 5 has definitely gone up, subscription aside. Version 4 had been on sale for 3 to 4 years at $5. The cost now is $20 a year, which is still more affordable than Ulysses but four times the cost of the previous version. Given that it is per year, it would be $60 for 3 years, compared to $15 at the previous rate (even treating the one-time $5 purchase as a yearly one). I think a part of the negative reaction to subscriptions is that they often amount to price increases at the same time. Every user will have their own line based on usefulness and budget. For my purposes of blogging and transcripts I could just as easily rely on Pages or Notes. This kind of app is optional for me and I wouldn't want to pay more than $10/year. The previous price of $5 was too little, especially given it was a one-time purchase. I think this time he's jumped just a little too far in the other direction. But that's my judgment from my perspective.

All that said, while Drafts 5 is a subscription, I actually like the way it's being done. I can use nearly the full functionality of the app without a subscription. All the essential stuff works and some of the "extras" too. The pro subscription is for the advanced feature set that I may never use. For those that use those pro features the subscription is a great way to support continued development. If I find myself using the app, even just the basic functionality, I'll likely subscribe for at least a few months to contribute to the development. I like that I have the option to drop out of the subscription and continue creating with the app. With Ulysses' subscription I felt locked in, restricted to viewing and exporting only, so I stopped using it the day they switched to subscriptions. Yes, I know I could have continued using the version I had till it stopped working with a future iOS update. But I didn't see the point given my eventual departure. Why lock in all my writing when the end result would be the same: moving to a different app?

A Good Problem to Have

It's great that there are so many fantastic apps being developed for iOS. I'd much rather have too many to choose from than not enough. I'm looking forward to spending more time with Drafts. I'm curious to see if it becomes a habit. I'm used to going to apps and I suspect I'll continue to do so. That said, I see the merit of having one place to start all text which then gets pushed out to other apps. Time will tell.

Apple’s Education Commitment

The March 27 Apple education event has come and gone, and I've taken a couple of weeks to digest others' responses, which have been generally positive. My own response at the time of the event was very positive, and after a couple of weeks I continue to feel that way. There's a lot to unpack, so I've taken my time mulling it over. Also, I should say, I'm not a teacher in any school. But education is dear to me and I've spent many, many hours of my life encouraging lifelong learning in every community I have lived in. I've helped homeschool kids, volunteered at an adult literacy program, led group workshops for all ages and tutored one-on-one. Education is something I've done because I deeply value personal growth. I want the humans around me to strive towards the fullest expression of their potential, and if I can be a part of that process I consider myself very fortunate.

Okay, enough of that. Let’s get back to Apple’s announcements.

**Everyone Can Create curriculum**
Apple has added a new curriculum alongside the previously introduced Everyone Can Code. This new branch is exactly what we would expect as the follow-up: Everyone Can Create. They created a video to tell this story and it's a lot of fun.

Apple has this to say about the importance of creativity in the learning process:

After 40 years working alongside educators, we’ve seen — and research has shown — that creative thinking leads to deeper student engagement. And when students are more engaged, they take more ownership of their learning. Creative skills help students become better problem solvers, communicators, and collaborators. They explore more and experiment more. They tell richer stories and find their own unique voices. They stretch their imaginations and make connections they might not otherwise make — and carry all these skills through everything they’ll do in school. And beyond.

They've gone to great lengths to highlight the iPad as the best computer for students to have in their endeavors, from form factor to app ecosystem, and I think it's true. Chromebooks have gained market share due to their cheap price, ease of management, and coverage of conventional academic needs such as the writing of papers. But stuck in the laptop form factor, Chromebooks are good for sitting on desks and doing inside, at-a-desk tasks. Apple continues to position the iPad as the tool that goes beyond the desk, beyond boundaries. It's the tool that kids can take outside to record video or sketch or paint or photograph or record audio. It's the tool that can be used to assemble those videos, sketches, paintings, photographs and audio recordings into a variety of academic presentations or reports or even books.

But Apple isn’t just putting the device out there. They’ve taken the added step of providing lessons to help the process along. The Teacher Guide preview for the Everyone Can Create curriculum looks pretty fantastic.

Designed with the help of educators and creative professionals, Everyone Can Create introduces the fundamental skills and techniques of video, photography, music, and drawing. Students will use free apps available on any iPad, like Clips and GarageBand, taking advantage of the built-in camera, microphone, speakers, Multi-Touch display, and support for Apple Pencil. The curriculum also offers materials to help teachers infuse these skills into the subjects they teach every day. So students can use musical rhythm to make a math equation more memorable, produce a video to re-create a turning point in history, or use drawing to illustrate a city’s changes in architecture over time.

I've been reading through it, and to this layman's eyes it looks like a really engaging creative process to enhance the learning of material that might otherwise be learned in more traditional ways, namely the taking of notes from lectures and books leading to essays and papers. This curriculum is Apple's recipe for using the creative arts to build a new process.

There's a lot more on Apple's More for Teachers resource page.

Cost
One persistent criticism of an iPad-centered curriculum is the price. Quite a few have commented that it's too much, especially for an education system that is cash-strapped. Oh, I've got some thoughts on this. Boy do I. Our lack of funding for education is nothing more than a political problem that is immediately fixable. The fact that the U.S. has, for decades, chosen to grossly outspend every other nation on the planet on its military is THE direct cause of our under-funded education system. This is not Apple's fault and not Apple's problem. Period.

My proposal is an immediate cut to military funding by 40%. Yes, 40%. Then 50% and then 60% and then 70%. Let's put that money into education, healthcare, and other humanitarian programs. There's no reason, none at all, that our education system should be anything but fully funded, and such a system could afford iPads for every student and far more than that.

Until such a time as we make better, more ethical choices about our national priorities schools will continue to go underfunded. In that environment many schools will not be able to go all in on iPads. For many of the poorest schools even the cheapest Chromebooks might be out of reach.

Creativity with iPad

I've used the iPad for a variety of creative endeavors over the years, from video editing to design projects to paintings of nebulae in deep space. It's a fantastic creative tool, and with the latest iPad the Pencil is now available for the base model. Serenity Caldwell's review of the 2018 iPad and the Apple Pencil released at Apple's education event is an excellent example of what is possible with this new device: written, edited and completely produced using the new iPad. I'm a big fan of iMore in general and Serenity in particular. She's always thorough and offers a balance of positivity and critique that I've come to appreciate. She really digs into what can be done with these devices and steps outside of the usual written review. Actually, she often writes a review too, but she doesn't stop with the written word. From her illustrated review of the Apple Pencil to this most recent review, she really explores the creative potential of the device being reviewed. With her current video review she also offers a detailed description of the process she used. Very helpful for anyone wanting to learn more about how to create with their iPad.

Serenity also put together a round-up of tips, techniques, apps and website resources for those who want to learn how to draw using an iPad and Pencil. I'm going to do a post soon about my recent exploration of lettering using the Pencil and iPad. I've long avoided handwriting in favor of the keyboard. My handwriting, never good, has only gotten worse. Via a tweet by Matt Gemmell I recently discovered a free video tutorial for brush lettering using the iPad and it's been fun.

It goes without saying but I’ll say it anyway, of all the Apple computing devices I’ve owned the iPad is, by far, my favorite. Just a few years ago I never would have thought my favorite would be anything other than a Mac. 

Apple’s Renewable Energy

Apple's attention to the details of its environmental impact has become one of the best things about the company. They are in a position to have a substantial impact and they are pushing forward constantly. The latest news is that Apple is now globally powered by 100 percent renewable energy. But even better, they are getting their suppliers onto clean energy: as of today, nine more of its suppliers have committed to 100% clean energy production.

“We’re committed to leaving the world better than we found it. After years of hard work we’re proud to have reached this significant milestone,” said Tim Cook, Apple’s CEO. “We’re going to keep pushing the boundaries of what is possible with the materials in our products, the way we recycle them, our facilities and our work with suppliers to establish new creative and forward-looking sources of renewable energy because we know the future depends on it.”


Apple’s new headquarters in Cupertino is powered by 100 percent renewable energy, in part from a 17-megawatt onsite rooftop solar installation.

Apple and its partners are building new renewable energy projects around the world, improving the energy options for local communities, states and even entire countries. Apple creates or develops, with utilities, new regional renewable energy projects that would not otherwise exist. These projects represent a diverse range of energy sources, including solar arrays and wind farms as well as emerging technologies like biogas fuel cells, micro-hydro generation systems and energy storage technologies.

It goes without saying that all companies should follow Apple’s lead.

Roundup of recent articles and podcasts

We'll start with MacStories, which has been very busy churning out articles I've really enjoyed.

Most recently, Federico Viticci hit on a topic that I also recently wrote about. Of course, his take is of much greater length and detail (when are his articles not?). Erasing Complexity: The Comfort of Apple's Ecosystem is an excellent read:

There are two takeaways from this story: I was looking for simplicity in my tech life, which led me to appreciate Apple products at a deeper level; as a consequence, I’ve gained a fresh perspective on the benefits of Apple’s ecosystem, as well as its flaws and areas where the company still needs to grow.

After a couple of years experimenting with lots of third-party hardware and apps, he's simplifying:

But I feel confident in my decision to let go of them: I was craving the simplicity and integration of apps, services, and hardware in Apple’s ecosystem. I needed to distance myself from it to realize that I’m more comfortable when computers around me can seamlessly collaborate with each other.

I've never gone to the lengths that he has. I don't have the money, time or inclination for such far-ranging experimentation, be it apps or hardware. But I've dipped my toes in enough to know that constant experimentation with new apps takes away from my time doing other things. At some point experimentation becomes a thing unto itself, which is fine if that's something one enjoys. I think many geeks fall into this.

His conclusion is spot on:

It took me years to understand that the value I get from Apple’s ecosystem far outweighs its shortcomings. While not infallible, Apple still creates products that abstract complexity, are nice, and work well together. In hindsight, compulsively chasing the “best tech” was unhealthy and only distracting me from the real goal: finding technology that works well for me and helps me live a better, happier life.

This tech helps us get things done. It is a useful enhancement but it is not the end goal.

A week or so ago Apple announced an upcoming event for March 27, centered on education and taking place in Chicago. There's a lot they can do in this area, but they haven't provided much detail about the event, so of course there's been LOTS of speculation. John Voorhees of MacStories has a fantastic write-up of his expectations based on recent history in the education tech area as well as Apple's history in education. He thinks the event will "mark a milestone in the evolution of its education strategy":

However, there’s a forest getting lost for the trees in all the talk about new hardware and apps. Sure, those will be part of the reveal, but Apple has already signaled that this event is different by telling the world it’s about education and holding it in Chicago. It’s part of a broader narrative that’s seen a shift in Apple’s education strategy that can be traced back to WWDC 2016. Consequently, to understand where Apple may be headed in the education market, it’s necessary to look to the past.

It’s a great read. The event is this week so we’ll know more soon.

With the topic of Apple and education there's been a lot of talk about Google's success with Chromebooks in education. As the story goes, many schools have switched because Chromebooks are cheap, easy to manage and come with free cloud-based apps that teachers (and school staff) are finding very useful. Another one of my favorite Apple writers is Daniel Eran Dilger over at Apple Insider, and he's got a great post challenging the ongoing narrative that Apple is in dire straits in the education market, specifically the currently popular idea that Apple should drop its prices in a race to the bottom with companies that sell hardware for so little that they're making little to no profit. How is "success" measured in such spaces? Dilger covers a lot of ground and it's worth a read for more context, current and historical, on that market. He's got another recent post, Google gives up on tablets: Android P marks an end to its ambitious efforts to take on Apple's iPad, about Google's largely failed attempt at entering the tablet market in general.

Rene Ritchie over at iMore continues to do a fantastic job both in his writing and podcasting. His recent interview with Carolina Milanesi on the subject of Apple and education is excellent. It's available there as audio or transcript. I found myself agreeing with almost everything I heard. Carolina recently posted an excellent essay on tech in education over at Tech.pinions.

One thing in particular that I'll mention here: iWork. I love the iWork apps and have used them a lot over the years. That said, I agree with the sentiment that they are not updated nearly often enough. I would love for Apple to move these apps higher up the priority list. It would be great to see the iPad versions finally brought up to par with the Mac versions.

Rene also did another education-related podcast interview, this one with Bradley Chambers, whose day job is education IT.

Siri and the iOS Mesh

Over the past couple of years it's become a thing, in the nerd community, to complain incessantly about how inadequate Siri is. To which I incessantly roll my eyes. I've written many times about Siri, and it's mostly been positive because my experience has been mostly positive. Siri's not perfect, but in my experience it's usually pretty great. A month ago HomePod came into my house and I've been integrating it into my daily flow. I'd actually started a "Month with HomePod" sort of post but decided to fold it into this one because something shifted in my thinking over the past day, and it has to do with Siri and iOS as an ecosystem.

It began with Jim Dalrymple's post over at The Loop: Siri and our expectations. I like the way he's discussing Siri here. Rather than just complain as so many do, he breaks it down in terms of expectations per device and the resulting usefulness. To summarize, he's happy with Siri on HomePod and CarPlay but not on iPhone or Watch. His expectations on the phone and watch are higher and are not met, leading him to conclude: "It seems like such a waste, but I can't figure out a way to make it work better."

As I read through the comments I came to one by Richard in which he states, in part:

“I’ve improved my interactions with Siri on both my iPhone 8 and iPad Pro by simply avoiding “hey Siri” and instead, holding down the home button to activate it. Not sure how that’s done on an iPhone X but no doubt there’s a way….

A lot of folks gave up on Siri when it really sucked in the beginning and like you, I narrowed my use to timers and such. But lately I’m expanding my use now that I’ve mostly dumped “hey Siri” and am getting much better results. Obviously “hey Siri” is essential with CarPlay but it works well there for some odd reason.”

Since getting the HomePod I've reserved "Hey Siri" for that device and the watch. My iPads and iPhone are now activated via button, and yes, it seems better because it's more controlled, more deliberate and usually in the context of my iPad workflow. In particular I like activating Siri from the iPad with the Brydge keyboard, which has a dedicated Siri key at the bottom left. The interesting thing about this keyboard access to Siri is that it feels more instantaneous.

Siri is also much faster at getting certain tasks done on my screen than tapping or typing could ever be. An example: searching my own images. With a tap and a voice command I've got images presented in Photos from whatever search criteria I've given. Images of my dad from 2018? Done. Pictures of dogs from last month? Done. It's much faster than opening the Photos app and tapping into a search. Want to find YouTube videos of Stephen Colbert? I could open a browser window and start a search, which will load results in Bing, or type in YouTube and wait for that page to load, then type in Stephen Colbert, hit return and wait again. Or I can activate Siri and say "Search YouTube for Stephen Colbert", which loads much faster than a web page, then tap the link in the bottom right corner to be taken to YouTube for the results.

One thing I find myself wishing for on the big screen of the iPad is that the activated Siri screen be just a portion of the screen rather than a complete take-over of the iPad. Maybe a slide-over? I'd like to be able to make a request of Siri and keep working rather than wait. And along those lines, I'd like Siri treated like an app, allowing me to go back through my Siri request history. The point here is that Siri isn't just a digital assistant but is, in fact, an application. Give it a persistent form with its own window that I can keep around and I think Siri would be even more useful. Add to that the ability to drag and drop (which would come with its status as an app) and it's even better.

Which brings me to voice and visual computing. Specifically, the idea of voice first computing as it relates to Siri, HomePod and others such as Alexa, Google, etc. After a month with HomePod (and months with AirPods) I can safely say that while voice computing is a nice supplement to visual for certain circumstances, I don’t see it being much more than that for me anytime soon, if ever. As someone with decent eyesight and who makes a living using screens, I will likely continue spending most of my days with a screen in front of me. Even HomePod, designed to be voice first, is not going to be that for me.

I recently posted that with HomePod as a music player I was having issues choosing music. With an Apple Music subscription there is so much, and I'm terrible at remembering artist names and even worse at album names. It works great to just ask for music or a genre or a recent playlist. That covers about 30% of my music playing. But I often want to browse, and the only way to do that is visually. So, from the iPad or iPhone I'm usually using the Music app for streaming or the Remote app for accessing the music in my iTunes library on my Mac Mini. I do use voice for some playback control and make the usual requests to control HomeKit stuff. But I'm using AirPlay far more than I expected.

Music

Using the Music app and Control Center from iPad or iPhone is yet another way to control playback.

Apple has made efforts to connect our devices together with things such as AirDrop and Handoff. I can answer a call on my watch or iPad. At this point everything almost always remains in constant sync. Moving from one device to another is almost without any friction at all. What I realize now is just how well this ecosystem works when I embrace it as an interconnected system of companions that form a whole. It works as a mesh which, thanks to HomeKit, also includes lights, a heater and a coffee maker, with more devices to come in the future. An example of this mesh: I came in from a walk 10 minutes ago, streaming Apple Music on my phone and listening via AirPods. When I came inside I tapped the AirPlay icon to switch the audio output to HomePod. But I'm working on my iPad and can control the phone's playback via Apple Music or Control Center on the iPad or, if I prefer, I can speak to the air to control that playback. A nice convenience because I left the phone on the shelf by the door whereas the iPad is on my lap.

At any given moment, within this ecosystem, all of my devices are interconnected. They are not one device, but they function as one. They allow me to interact, visually or with voice, with different iOS devices in my lap or across the room, as well as with non-computer devices in HomeKit. That means I can turn a light off across the room or, if I'm staying late after a dinner at a friend's house, turn on a light for my dogs from across town.

So, for the nerds that insist that having multiple timers is very important, I’m glad that they have Alexa for that. I’m truly happy that they are getting what it is they need from Google Assistant. As for myself, well, I’ll just be over here suffering through all the limitations of Siri and iOS.

Revisiting iTunes with HomePod

Like many, I've been using iTunes since its first versions. Over the past year that use dwindled a great deal as my music playing happened mostly via Music on an iOS device. And in the couple of years before that I'd been using Plex on iOS devices and AppleTV to access my iTunes library on the Mac because, frankly, the home sharing was pretty crappy. Alternatively, I would also use the Remote app on an iOS device to control iTunes on the Mac, which also worked pretty well. The downside was that I didn't have a decent speaker. I alternated between various (and cheap) computer and/or Bluetooth speakers and the built-in TV speakers. None of them were great but they were tolerable. I live in a fairly small space, a "tiny house", so even poor to average speakers sound okay.

Today I've got the HomePod, and after a year of enjoying Apple Music on iPads and iPhone I've added lots of music that I usually just stream, often from my recently played or heavy rotation lists. But two things have surfaced now that I've been using HomePod for a few days. First, as I mentioned in my review of HomePod, I'm not very good at choosing music without a visual cue. Second, I live in a rural location, and when my satellite data allotment runs out, streaming Apple Music becomes less dependable. Sometimes it's fine. Sometimes not. Such was the case last night. So, after a year away from the iTunes library on my Mac, I opened the Apple Remote app, set the output for iTunes to the HomePod and spent some time with my "old" music all streaming to the best speaker I've ever owned. So nice.

This morning it occurred to me that while I'm on my "bonus" data (2am-8am) I should consider downloading some of the music I've added to my library over the past few months of discovery through Apple Music. And in looking at that list I see that every month or two, as I've discovered new music, the previous new discoveries roll out of my attention span. There are "new" things I discovered 5 months ago that I enjoyed but then forgot. It's a great problem to have! So I've spent the morning downloading much of the music I've added to my library over the past year. I never intended to actually download any of it, as the streaming has worked so well. But with the HomePod I see now that keeping my local iTunes/Music library up-to-date has great benefit.

Remote App

So, how well does this new music-playing process work? I rarely touch my Mac. It's a server and I use it for a few design projects that I cannot do on my iPad. So, as mentioned above, I've been using the original iOS "Remote" app, which opens up an iTunes-like interface and works very well for choosing music on the Mac to play via AirPlay to HomePod. Of course I can still use Siri on the HomePod for all of its normal features. The only thing I do in the Remote app is choose the music. Apple's not done much with the interface of that app, so it looks pretty dated at this point. Actually, it looks very much like iTunes, just a slightly older version of it. But even so, being able to easily browse by albums, artists, songs and playlists is very comfortable. It fits a little better with my lifelong habit of choosing music visually.

Why not just access my local iTunes music via the Home Sharing tab in the Apple Music app on my iPad or iPhone, which could then be AirPlayed to the HomePod? I guess this would be the ideal, as it would allow me to stay in the Apple Music app. Just as the Remote app allows for browsing my Mac's music library, so too does the Music app. But performance is horrendous. When I tap the Home Sharing tab and then the tab for my Mac Mini I have to wait a minimum of a minute, sometimes more, for the music to show up. Sometimes it never shows up. If I tap out of the Home Sharing library I have to wait again the next time I try to view it. It's terrible. By contrast, the Remote app loads music instantly. There is, at most, a second of lag.

But what's even worse is that the Music app does not even show any of the Apple Music I've downloaded to my local iTunes library. It really is a terrible experience and I'm not sure why Apple has done it this way. So, the Remote app wins easily, as it actually lets me play all of my new music and does so with an interface that updates instantly even if it is dated.

I suspect that my new routine will be to use Apple Music for discovery, via Apple's playlists and suggested artists, when I'm out walking, which is usually a minimum of an hour a day. My favorite discoveries will get downloaded to my local library, and when at home I'll spend more time accessing my iTunes library via the Remote app. All in all, I suspect that I'll be enjoying more of my library, old and new, with this new mix and of course, all of it on this great new speaker!

HomePod: Sometimes great, sometimes just grrrrrrrrrrr.

Tuesday Morning
I'm getting out of bed as two dogs eagerly await a trip outside which they know will be followed by breakfast. I ask Siri to play the Postal Service. She responds: "Here you go," followed by music by the Postal Service. The music is at about 50% volume. Nice. But in three full days of use I'm feeling hesitant about HomePod and the Siri within. And the next moment illustrates why. I slip on my shoes and jacket and ask Siri to pause. The music continues. I say it louder and the iPhone across the room pipes up: "You'll need to unlock your iPhone first." I ignore the iPhone and look directly at HomePod (5 feet away) and say louder, as I get irritated, "Hey Siri, pause!" Nothing. She does not hear me (maybe she's enjoying the music?). By now the magic is long gone, replaced by frustration. I raise my voice to the next level, which is basically shouting, and finally HomePod responds and pauses the music. Grrr.

I go outside with my canine friends and upon return ask Siri to turn off the porch light. The iPhone across the room responds and the light goes off. I ask her to play and the HomePod responds and the Postal Service resumes. I get my coffee and iPad and sit down to finish off this review. I lay the iPhone face down so it will no longer respond to Hey Siri. Then I say, Hey Siri, set the volume to 40%. Nothing. I say it louder and my kitchen light goes off followed by Siri happily saying “Good Night Enabled”. Grrrrrrrrrrrr. I say Hey Siri loudly and wait for the music to lower then say “set the Kitchen light to 40%” and she does. The music resumes and I say Hey Siri and again I wait then I say “Play the Owls” and she does. I’d forgotten that I also wanted to lower the volume. But see how this all starts to feel like work? There’s nothing magical or enjoyable about this experience.

Here’s what I wrote Sunday morning as I worked on this review:

"When I ordered the HomePod I had no doubt I would enjoy it. Unlike so many that have bemoaned the missing features I was happy to accept it for what Apple said it was. A great sounding speaker with Apple Music and Siri. Simple."

It really is that simple. See how I did that? Apple offered the HomePod and I looked at the features and I said yes please.

I then proceeded to write a generally positive review, which is below and which was based on my initial impressions from 1.5 days of use. By Monday I'd edited it to add in more details, specifically the few failures I'd had with Siri and the frustration of the iPhone answering when I didn't want it to.

I went into the HomePod expecting a very positive experience. And it's mostly played out that way. But it's interesting that by Tuesday morning my expectations of failure and frustration have risen. Not because HomePod is becoming worse. I'd say it's more about the gradual accumulation of failures. They are the exception to the rule but happen often enough to create a persistent sense of doubt.

Set-up
As has been reported, it's just like the AirPods. I was done in two minutes. I did nothing other than plug it in and put my phone next to it. I tapped three or four buttons and entered a password. Set-up could not possibly be any easier.

Siri
In a few days of use I'm happy to report that HomePod has performed very well. With almost every request I have made, Siri has provided exactly what I asked. My hope and expectation was that Siri on HomePod would hear my requests at a normal room voice. While iPad and iPhone both work very well, probably at about 85% accuracy, I have to be certain to speak loudly if I'm at a distance. Not a yell¹, but just at or above normal conversational levels. With HomePod on a shelf in my tiny house, Siri has responded quickly and with nearly 100% accuracy, and that's with music playing at a fairly good volume. Not only do I not have to raise my voice, I've been careful to keep it at normal conversational tones or slightly lower. I'll say that my level is probably slightly lower than what most people in the same room would easily understand with the music playing.

For the best experience with any iOS device I've learned not to wait for Siri. I just say Hey Siri and naturally continue with the rest of my request. This took a little practice because early on I think Siri required a slight pause, or so it seemed. Not anymore. But there's no doubt, Siri still makes mistakes, even when requesting music, which is supposedly her strongest skill set.

The first was not surprising. I requested music by Don Pullen, a jazz musician a friend recommended. I'd never listened to him before, and no matter how I said his name Siri just couldn't get it. She couldn't do it from iPhone or iPad either. Something about my pronunciation? I tried probably 15 times with no success. I did, however, discover several artists with names that sound similar to Don Pullen. I finally turned on Type to Siri and typed it in, and sure enough, it worked. I expect there are other names, be they musicians or things outside of music, that Siri just has a hard time understanding. I've encountered it before but not too often. The upside: the next morning I requested Don Pullen and Siri correctly played Don Pullen. Ah, sweet relief. A sign that she is "learning"?

Another fail that seems like a learning process for Siri: the first time I requested REM Unplugged 1991/2001: The Complete Sessions she failed because I didn't give the full name. I just said REM Unplugged and she started playing a radio station for REM. When I said the album's full name it worked. I went back a few hours later, just said REM Unplugged, and it worked. Again, my hope is that she learns what it is I'm listening to so that in the future a long album name or a tricky artist name will not confuse her. We'll see how it plays out (literally!).

Yet another failure, and this one really surprised me. I’ve listened to the album “Living Room Songs” by Olafur Arnalds quite a bit. I requested Living Room Songs and she began playing the album Living Room by AJR. Never heard of it, never listened to it. So, that’s a BIG fail. There’s nothing difficult about understanding “Living Room Songs” which is an album in my “Heavy Rotation” list. That’s the worst kind of fail.

One last trouble spot worth mentioning. I have Hey Siri turned on on both my iPhone and Apple Watch. Most of the time the HomePod picks up, but not always. On several occasions both the phone and watch have responded. I've gotten in the habit of keeping the phone face down, but I shouldn't have to remember to do that. I definitely see room for improvement here.

I've requested the other usual things during the day with great success: the latest news, the most recent episode of one of my regular podcasts, the weather forecast and current temperature. I've sent a few texts, controlled various HomeKit devices, checked the open hours of a local store and created a few reminders. It all worked the first time.

There were a couple of nice little surprises. In changing the volume, it’s possible to just request that it be “turned up a little bit” or “down a little bit”. I’m guessing that there is a good bit of that natural language knowledge built in and we only ever discover it by accident. Also, I discovered that when watching video on the AppleTV, if the audio is set to HomePod, Siri works for playback control so there’s no need for the Apple remote! This works very well. Not only can Siri pause playback but fast forward and rewind as well.

Audio Quality
Of course Apple has marketed HomePod first and foremost as a high quality speaker, a smart Siri speaker second. I agree with the general consensus that the audio quality is indeed superb. For music, and as a sound system for my TV, I am very satisfied. My ears are not as well tuned as some, so I don't hear the details of the 3D "soundstage" that others have described. I subscribe to Apple Music so that's all that matters to me, and it works very well. Other services, or third-party podcast apps, can be played from a Mac or iOS device via AirPlay to HomePod. I use Apple's Podcasts app (specifically for the Siri integration) so it's not an issue for me.

Voice First: Tasks and Music
The idea of voice-first computing has caught on among some in the tech community who are certain that it is the future. I have doubts. Even assuming perfect hardware that always hears perfectly and parses natural language requests perfectly (we're not there yet), I have problems with the cognitive load of voice computing. I'll allow that it might just be a question of retraining our minds for a while. It's probably also a process of figuring out which things are better suited for voice. Certain tasks are super easy and tend to work with Siri via whatever device. These are the usual things people do because they require very little thinking: setting timers, alarms and reminders, controlling devices, getting the weather.

But let's talk about HomePod and Siri as a "musicologist" for a moment. An interesting thing about playing music, at least for me, is that I often don't know what it is I want to play. When I was a kid I had a crate of records and a box of cassette tapes. I could easily rattle off 10 to 20 of my current favorites. Over time it changed and the list grew. But it was always a list I could easily remember. Enter iTunes and eventually Apple Music. My music library has grown by leaps and bounds. My old favorites are still there, but they are now surrounded by a seemingly endless stream of possibility. In a very strange way, choosing music is now kind of difficult because it's overwhelming. On the one hand I absolutely love discovering new music. I'm listening to music I never would have known of were it not for Apple Music. I've discovered I actually like certain kinds of jazz. I'm listening to an amazing variety of ambient and electronic music. Through playlists I've discovered all sorts of things. But if I don't have a screen in front of me the chances of remembering much of it are nil. If I'm lucky I might remember the name of a playlist, but even that is difficult as there are so many being offered up.

So while music on the HomePod sounds fantastic when it's playing, I often have these moments of "what next?" And in those moments my mind is often blank and I need a screen to see what's possible. I'm really curious how other people using voice-only music devices decide what to play next.

Conclusion
There isn't one. This is the kind of device that I want to have. I'm glad I have it. I enjoy it immensely. It is a superb experience until it isn't, which is when I want to throw it out a window. Hey Apple, thanks?


  1. Well, sometimes a yell is actually required. ↩︎

Using an iPad to maintain websites – my workflow

A couple of weeks ago I wrote about my website management workflow changing up a bit due to Panic's recent announcement that they were discontinuing Transmit for iOS. To summarize: yes, Transmit will continue to work for the time being, and Panic has stated that it will continue developing Coda for iOS. But they've been slow to adopt new iOS features such as drag and drop while plenty of others are already offering that support. So, I've been checking out my options.

After two weeks with the new workflow on the iPad I can say this was a great decision, and I no longer consider it tentative or experimental. This is going to stick and I'm pretty excited about it. I've moved Coda off my dock and into a folder. In its place are Textastic and FileBrowser. Not only is this going to work, it's going to be much better than I expected. Here's why.

iCloud Storage, FTP, Two Pane View
Textastic allows my "local" file storage to be in iCloud. So, unlike with Coda, my files are now synced between all devices. Next, Textastic's built-in FTP is excellent. And I get the two-pane file browser I've gotten used to with Transmit and Coda: local files on the left, server files on the right. The HTML editor is excellent and is, for the most part, more responsive than Coda's. Also, and this is really nice as it saves me extra tapping, I can upload right from a standard share button within the edit window. Coda requires switching out of the edit window to upload changes.

Drag and Drop
Unlike Transmit and Coda, the developers of FileBrowser have implemented excellent drag and drop support. I've set up FTP servers in FileBrowser and now it's a simple action to select multiple files from practically anywhere and drag them right into my server. Or, just as easily, because I've got all of my website projects stored in iCloud, I can drag and drop from anywhere right into the appropriate project folder in the Files app, then use the FTP server in Textastic to upload. Either way works great. Coda/Transmit do not support drag and drop between apps and are a closed silo. The new workflow is much more open and has less friction.

Image Display and Editing
One benefit of FileBrowser is the display of images. In the file view, thumbnails on the remote server are nicely displayed. If I need to browse through a folder of images at a much larger size I can do that too, as it has a full-screen image display that allows for swiping through. Fantastic, and not something offered by Transmit or Coda. Also, from a list view in either Files or FileBrowser, local or remote, I can easily drag and drop an image into Affinity Photo for editing. Or, from the list view, I can select the photo to share/copy to Affinity Photo (or any image editor).

Textastic and Files
This was another pleasant surprise. While I'll often get into editing mode and just work from an app, in this case Textastic, every so often I might come at the task from another app. Say, for example, I've gotten new images emailed from a client, as happened today. I opened Files in split view with Mail. In two taps I had the project folder open in Files. A simple drag and drop and my images were in the folder they needed to go to. The client also had text in the body of the email for an update to one of his pages. I copied it, then tapped the HTML file in Files, which opened the file right up in Textastic. I made the change, then uploaded the images and HTML files right from Textastic.

Problems?
Thus far I've encountered only one oddity with this new workflow, and it has to do with that last point: editing files by selecting them from within the Files app. As far as I can tell, this is not creating a new copy or anything; the file is being edited in place within Textastic. But any file I've accessed via Files shows up as a slight variation in the recents list within Textastic. Same file, but the app seems to treat it as a different one, so it appears twice in the recent files list. Weird. It is just one file though, and my changes are intact regardless of how I open it. As a user it seems like a bug, but it may just be "the way it works".
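
For what it's worth, my guess is this is tied to the "open in place" mechanism Files uses. An app that declares LSSupportsOpeningDocumentsInPlace in its Info.plist gets handed a security-scoped URL to the original file rather than a copy. A rough sketch of the receiving side (the delegate method is standard UIKit; the body is my illustration, not Textastic's actual code):

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // Called when the Files app (among others) hands the app a document.
    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplication.OpenURLOptionsKey: Any] = [:]) -> Bool {
        // true when the URL points at the original file rather than a copy.
        let inPlace = options[.openInPlace] as? Bool ?? false
        guard url.startAccessingSecurityScopedResource() else { return false }
        defer { url.stopAccessingSecurityScopedResource() }
        // Edits made while access is held are written back to the original,
        // which is why no duplicate appears on disk even if the app's own
        // recents list shows two entries.
        print("Opened \(url.lastPathComponent) in place: \(inPlace)")
        return true
    }
}
```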

Using HomeKit

Smart Plugs
Last spring I finally purchased my first smart plug, a HomeKit-compatible plug from KooGeek. It worked. I bought a second. A few weeks later the local Walmart had the iSP6 HomeKit-compatible plugs from iHome on sale. Only $15. I bought three. My plan was to use these with lights and to have one for my A/C in the summer, to be swapped out to the heater in my well-house in the winter. I'm pretty stingy in my use of energy, so in the winter I make it a point to keep that heater off and only turn it on when I must, which requires a good bit of effort on my part. I don't mind walking out to the well-house, as I can always use the steps, but it's the mental tracking of it and the occasional forgetting that is bothersome. Having a smart plug makes it convenient to power it on and off, but I'm still having to remember to keep tabs.

Automations
Enter automations. The Home app gets better with each new version. Using automations it is now possible to trigger a scene, a device or multiple devices at specific times, at sunset/sunrise, or a set time before or after either. Very handy for a morning light but not too helpful for my well-house heater. But wait, I can also set up an automation for a plug based on a HomeKit sensor such as the iHome 5-in-1 Smart Monitor. I put the monitor in the well-house and create an automation to turn on the heater if the temperature dips to 32. I've turned my not-so-smart heater into a smarter one which will keep my water from freezing with no effort from me. Even better, it will reduce my electricity use because of its accuracy.
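
Under the hood, this kind of automation maps onto HomeKit's event triggers. A minimal sketch, assuming placeholder names for the sensor's temperature characteristic and a heater action set (HomeKit reports temperature in Celsius, so 32°F becomes 0):

```swift
import HomeKit

// Sketch of the kind of trigger the Home app builds for "turn on the
// heater when the well-house hits freezing". The temperature
// characteristic and heaterActionSet parameters are placeholders.
func addFreezeGuard(to home: HMHome,
                    temperature: HMCharacteristic,
                    heaterActionSet: HMActionSet) {
    // Fires when the sensor reports 0°C (32°F). This is a single-value
    // event; iOS 11's threshold-range events can express "at or below"
    // more precisely.
    let event = HMCharacteristicEvent(characteristic: temperature,
                                      triggerValue: NSNumber(value: 0.0))
    let trigger = HMEventTrigger(name: "Well-House Freeze Guard",
                                 events: [event],
                                 predicate: nil)
    home.addTrigger(trigger) { error in
        guard error == nil else { return }
        trigger.addActionSet(heaterActionSet) { _ in
            trigger.enable(true) { _ in }
        }
    }
}
```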

I have a similar dumb heater in my tiny house as well as a window A/C. I might use the same monitor to more accurately control heating and cooling in here. Currently I do that with constant futzing with controls and looking at a simple analog thermometer. It would be an improvement to just have a set temperature to trigger devices.

Lights
I've been avoiding purchasing HomeKit-compatible lights because most, such as those from Philips, also require the purchase of a hub. Also, the cost was a bit much. My reasoning being that if I just pick up smart plugs as they go on sale I can use those for lights or anything else. Cheaper and more versatile. That said, one benefit of the lights is that they can be dimmed, which is appealing. So, two weeks ago I picked up one of Sylvania's Smart bulbs. It works perfectly. I'll likely get another, but in my tiny house I don't need that many lights, so two dimmable bulbs will likely be enough. It's very nice to be able to ask Siri to set the lights at 40% or 20% or whatever. I have an automation that kicks the light on at 15% at my wake-up time. Very nice to wake up to a very low, soft light. With a simple request I can then ask Siri to raise the brightness when I'm actually ready to get out of bed.

Lighting Automations
An hour after sunrise I’ve got another set of LEDs that kick on for all of my houseplants that sit on two shelves by the windows. An hour after sunset those lights go off and at the same time the dimmable light comes on at 50%.
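
These sunrise/sunset offsets correspond to HomeKit's significant-time events (iOS 11 and up). A sketch along the same lines as the freeze-guard example above, with a hypothetical action set for the plant lights:

```swift
import HomeKit

// Sketch of an "hour after sunset" automation. plantLightsOff is assumed
// to be an action set that powers down the shelf LEDs.
func addSunsetTrigger(to home: HMHome, plantLightsOff: HMActionSet) {
    var offset = DateComponents()
    offset.hour = 1  // fire one hour after sunset
    let event = HMSignificantTimeEvent(significantEvent: .sunset,
                                       offset: offset)
    let trigger = HMEventTrigger(name: "Plant Lights Off",
                                 events: [event],
                                 predicate: nil)
    home.addTrigger(trigger) { error in
        guard error == nil else { return }
        trigger.addActionSet(plantLightsOff) { _ in
            trigger.enable(true) { _ in }
        }
    }
}
```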

Home
All of the set-up happens via the built-in iOS Home app. It's a fairly easy-to-use app that gets better with each new version of iOS. My set-up is pretty simple, but Home is designed to scale up to larger homes with more rooms and devices. In my case, I've got the Home app in split screen with Apple Music on my iPad Air 2. It's on a shelf within easy reach of my usual sitting spot on my futon/bed. While I do most interaction via Siri or automation, it's nice to have easy visual access. Especially handy for monitoring the well-house heater and temperature. Having Music open and ready to play to a speaker via AirPlay is very nice.

AppleTV as Hub
Of course, to really make this work a hub is required. A recent iPad running iOS 10 or one of the newer AppleTVs will do. I'm using the AppleTV because I've always got one on. Set-up was easy and I've never had to futz with it. The nice thing about this arrangement is that I can access my HomeKit devices from anywhere. Whether I'm in town or visiting family or out for a walk, checking or changing devices is just a couple of taps or a request to Siri.

HomePod
Last is the device that has not arrived yet. My HomePod is set to arrive February 9. I don't need it for any of this to work, but I suspect it will be a nice addition. Controlling things with Hey Siri has always worked pretty well for me, though I suspect it will be even better with HomePod. I'll find out soon.