On Stickers

As I seek to understand more about the popularity of stickers in messaging apps (hint: they’re more than just big emoji!), I thought I’d share some of the interesting articles I’ve come across.

Sticker Culture

  • Stickers: From Japanese craze to global mobile messaging phenomenon by Jon Russell (TNW)

    Despite success in Asia, it appears likely that the appeal of stickers is different in Western markets, where Romanic alphabets are better supported on smartphones and there is less of an emoji/cartoon culture.

  • Why is every messaging app under the sun trying to sell stickers to you? by Jon Russell (TNW)

    Stickers are a frictionless way to monetize a service. By that I mean that they do not immediately disrupt the user experience by serving adverts, forcing video plays or using other forced ‘interactions’ that might serve to draw revenue from sponsors. Stickers are not intrusive and can keep an app priced free.

  • The Elements of Stickers by Connie Chan (Andreessen Horowitz)

    The “trading” element, however, is less about statically collecting and more about dynamically custom-curating one’s personal collection of stickers. These collections also signal one’s “sticker street cred” in Asian messaging apps — you can always tell a newbie or non-tech savvy user by their use of the stock stickers only.

For Developers

For Users

Key Takeaways

  • Designers only need to submit @3x versions of stickers (max size: 618 x 618px, 500KB)
  • PNG files are preferred (even for animated stickers)
  • Pay attention to transparency because your stickers can overlap message bubbles and images in conversations 
  • If you are making stickers that feature a single character, name the sticker pack after the character (or “CharacterName & Friends”)
  • If you want to appeal to Asian users, a quick Google image search of the word “kawaii” wouldn’t hurt
  • Most sticker packs seem to have at least a dozen stickers

Even though I’m not the greatest artist, I’m hoping to have a sticker pack ready for September!

Voicemail Transcription in iOS 10 Beta

I don’t answer my phone much these days. It seems I’m always either holding my sleeping baby, dealing with some sort of unfortunate situation involving his clothes, or creeping around the house in ninja-like silence while he snoozes in his bassinet. There’s no room for noise in my life right now—not when my chances of getting a decent night’s sleep are on the line!

As such, one iOS 10 feature that I haven’t heard much about but that I’ve found very useful is voicemail transcription. For instance, I missed two calls from a friend today. Thanks to voicemail transcription, I found out that the first call was urgent: something had gone wrong in the online class that I helped him set up and he needed me to take a look at it.

The second call came just as Charlie was settling down for his all-important afternoon nap in my arms. This time, I saw that my friend just wanted to tell me a story and that I could call him back when I got a chance.

In both cases, glancing at the transcription was way more convenient than holding the phone to my ear and potentially waking Charlie, who I’m convinced could hear a butterfly flap its wings in Africa.

Another great thing about this feature is that it gives you the ability to provide quick and easy feedback on the quality of the transcription. Below the transcribed paragraph it says something like “How useful was this?” and you can select either “Useful” or “Not useful.” Dead simple, right? I’m sure that prompt will go away when the final version is released but for now I’m grateful for its existence.

It made me wish that Apple would add that same kind of feedback mechanism to all of its AI “suggestions,” even if only in the beta releases. Whether it be suggested apps, contacts, calendar items or locations, I should be given the opportunity to report on their usefulness/relevance. Otherwise, how does Apple get better at this? How do they know where they need to improve? Heck, how do they even know what they’re doing well?

Quick, unobtrusive feedback prompts are a great “opt-in” way of figuring out the answers to those questions.

Thoughts on Screen Time for Kids

In episode 176 of ATP, Casey, John and Marco discussed their thoughts on “screen time” for kids and whether or not parents should limit the amount of time their kids spend in front of screens of any kind. It caused me to reminisce about my own childhood as well as my [few] experiences with babies and screens, so I thought I’d share those memories here. (They might be kinda boring, so the tl;dr version is: I think screens are A-OK!)

A Childhood of Screens

You might know me as the stay-at-home mom who codes on the farm, but I actually grew up in Bristol, Connecticut. I have very fond memories of our house there; most of them involve me playing outside in our gigantic backyard: finding salamanders under rocks, riding around in my little motorized Jeep, trying to start fires with a magnifying glass…you know, kid stuff.

However, my mom worked a late shift so there was a period of time in the afternoon before my dad got home that I spent with a babysitter—a lovely, middle-aged French Canadian woman named Leona. “Ona” and I watched a LOT of television. I always joke that I learned how to spell by watching Wheel of Fortune. We watched the usual slew of early evening sitcoms (the one I remember most clearly is Roseanne) and I watched my favorite movie, Homeward Bound, at least a million times.

When I was 7, my family moved to Ohio and several things happened: 1) I started at a new school in the middle of second grade, 2) My parents bought me a TV for my room, and 3) I got a Super Nintendo. I had a hard time making friends in Ohio, so I spent lots of time playing Donkey Kong Country, poring over Nintendo Power magazines, and watching TV. Every night I’d fall asleep watching Nick at Nite, through which I became familiar with many of the shows of my parents’ time: I Love Lucy, The Jeffersons, Mary Tyler Moore, All in the Family, Bewitched, Happy Days, and more. In many ways, I think those shows helped me understand how adults related to one another, as well as develop a sense of humor and empathy.

Still, I spent plenty of time not looking at screens. Ohio used to be underwater once upon a time, and my parents and I would visit parks where you could dig for fossils of sea creatures. I also became interested in model trains, so we’d visit train museums and displays around the state.

Two years later, when I was 9, we moved to Nebraska. I spent hours on the family computer, playing with virtual pets, learning HTML, and discovering a vast world of pirated content. I’m pretty sure I played Pokémon on an emulator before my mom bought me my first Game Boy. Like many kids, I was so addicted to Pokémon that I’d sit in the back seat of the car after dark, struggling to play the game by the light of passing streetlights.

Let me tell ya, the late 80s/early 90s were a weird time to grow up because everything was changing so fast and nobody knew what they were doing. As a kid, I sort of straddled the divide between pre-Internet and always-connected—between one screen (the TV) and ubiquitous screens. Since computers and gaming systems were new, and cool, and fun, my parents didn’t think twice about letting me play with them. And now, after all those hours of unrestricted screen time, here I am: a relatively well-functioning human being.

Screens + Babies

Charlie's selfie

Charlie lined this shot up himself!

Charlie is three and a half months old now. He’s very interested in our phones and likes looking at himself via the front-facing camera. There’s a period of time during the day when he refuses to sleep, but is also too cranky/sleepy to play with anything. During that time, he sits on my husband’s lap and watches Fast N’ Loud, a show about restoring old cars. It’s all just a bunch of blurry blobs to him, yet he’s fascinated by the movement and the bright colors of the hot rods. When the episode is over, he’s usually ready to eat and finally take a nap.

There’s a little baby girl at our church that I watched a few times in the church nursery. She was too shy to interact with the other kids and so I just held her on my lap the whole time while she watched them play. Suddenly she noticed my Apple Watch and was transfixed by the honeycomb screen. At a little over a year old, she figured out that if she moved her finger over the surface of the watch, the little app icons would move. That interaction paradigm of touching a screen is incredibly easy for babies to get the hang of. It opens up a world of learning to them that can serve as a good supplement to those all-important hands-on activities.

The Future

I’m not worried about managing screen time with Charlie. In the same way my generation remembers cassettes, record players, rotary phones, and finding the answers to our questions at the public library, our kids may remember smartphones and tablets and 5K displays. In other words, the children who grew up with nothing but screens may very well be the ones who lead us into a future without them (or with fewer of them).

Our kids may be the ones who bring augmented reality to the mainstream. They may laugh at the thought of us staring at our phones all day. They may very well straddle a new divide: between ubiquitous flat pieces of glass and…well, whatever’s next. Heck, in some ways, that screen time might be essential in helping them figure out what should be next.

Sherlocked?

Well my friends, WWDC has come and gone and I, like many of you, am now deeply engrossed in the plethora of new videos, sample projects, and API diffs that Apple has posted.

Whether you were actually there or experienced the fun from the comfort of your home, you may have noticed one fateful phrase wedged amongst the many words on the Developer APIs slide: “Live Photo Editing.” And if you didn’t see it there, you may have read about it on the “What’s New in Photos” pop-up in the iOS 10 beta:

What's New in Photos

Screenshot by Casey Liss

So yeah, with iOS 10 you can now rotate, crop, and otherwise adjust Live Photos right in the Photos app—which is awesome and just as it should be!

I hesitate to say that I was sherlocked (which, incidentally, keeps auto-correcting to “shellacked” 😂). In order for an app to be sherlocked, I think there has to be some uncertainty involved. In other words, it wasn’t inevitable that Apple would build f.lux-like capabilities into iOS. Nor was it inevitable that Maps would gain the ability to locate your parked car, or that Photos would auto-generate compilations and call them Memories (there is an app by that name with similar functionality). However, I do believe it was inevitable that Apple would expand Live Photo-editing capabilities…the question was just “when?”

Now we know the answer.

And that’s OK. I learned so much building LiveRotate, and even sold a few copies! From its release on June 7 to today (June 28), LiveRotate was downloaded 304 times. A few people requested refunds, which was expected. I think the app can still provide value to the general public this summer, though when September rolls around I’ll likely remove it from sale.

Overall, I’m very happy with how well it sold, and am feeling more confident than ever about my ability to build apps!

LiveRotate stats

So what’s next for me? Well, I have two ideas for Messages apps: one sticker pack (depending on my drawing abilities) and one app that lets users build their own stickers. I’m also in the process of updating my Bible verse app for watchOS 3. After that, it’s back to Corgi Corral and then onward to some other app ideas that are floating around in my noggin (wow, does anyone use that word anymore?).

Best of luck to all of you with your summer projects! And for those tinkering with the idea of making an app: there’s no better time to get started! 😄

The Making of LiveRotate

I thought it might benefit other beginners if I wrote up an overview of how I went about building LiveRotate. (Spoiler alert: there was a lot of Googling involved!)

Starting the Project

When I began, I didn’t have the foggiest idea how PhotoKit worked, and I had all but forgotten how to use collection views, which help you display things in a grid. So, I turned to Apple to see if they had a sample project for the Photos framework and luckily, they do. It has even been updated to “illustrate the use of LivePhoto APIs.” Right on! 👍

I then translated almost the entire thing, line by line, into Swift. I’m not joking. I needed the code for the collection view, for displaying a Live Photo with a badge, and for caching thumbnails as you scroll, and that was honestly the bulk of the project (if anybody needs any of that code in Swift, just let me know!). As I translated the code, I learned what each piece did, so that I wouldn’t just be blindly copying things without building up my understanding.
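
For anyone who wants a feel for the caching piece, it boils down to asking a PHCachingImageManager for appropriately sized thumbnails as cells come on screen. Here’s a bare-bones sketch (not the sample project’s code verbatim; the size and names are just illustrative):

    import Photos
    import UIKit

    let imageManager = PHCachingImageManager()
    let thumbnailSize = CGSize(width: 200, height: 200)

    /// Request a thumbnail for a single collection view cell.
    func loadThumbnail(for asset: PHAsset, into imageView: UIImageView) {
        imageManager.requestImage(for: asset,
                                  targetSize: thumbnailSize,
                                  contentMode: .aspectFill,
                                  options: nil) { image, _ in
            imageView.image = image
        }
    }

    // As the user scrolls, you can also warm the cache for assets that are
    // about to appear:
    // imageManager.startCachingImages(for: upcomingAssets, targetSize: thumbnailSize,
    //                                 contentMode: .aspectFill, options: nil)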

Handling Rotation

Deciding how to rotate the photos was confusing at first because there are two ways you can do it. Option A is to set a rotation flag that tells devices how to display the photo (though not every program or device respects that flag). Option B is to “physically” rotate the bits using some kind of transform. Option B seemed like the right way to go, so I set about learning two new frameworks: Core Graphics for the JPEG part of the Live Photo and AVFoundation for the QuickTime movie part.

Rotating Photos

There are three types of image-related classes in iOS: UIImage, CGImage, and CIImage. For a beginner, that was SUPER CONFUSING (and still sort of is). Some more searching led me to a category for rotating CIImages by 90 degrees. The Swift equivalent of an Objective-C category is an extension, so I translated that code into a CIImage extension.
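
A minimal sketch of that kind of extension (the method name is mine, and it assumes a plain CGAffineTransform rotation followed by a translation back to the origin):

    import CoreImage
    import CoreGraphics

    extension CIImage {
        /// Returns the image rotated 90°, translated so that the
        /// rotated extent starts back at (0, 0).
        func rotated90DegreesClockwise() -> CIImage {
            let rotated = transformed(by: CGAffineTransform(rotationAngle: -.pi / 2))
            return rotated.transformed(by: CGAffineTransform(translationX: -rotated.extent.origin.x,
                                                             y: -rotated.extent.origin.y))
        }
    }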

Here’s an overview of the photo rotation steps:

  1. Request the photo data using PHAssetResourceManager
  2. Create a CIImage from the data and use the extension to rotate it
  3. Add appropriate metadata (more on this later), convert the resulting image to a JPEG and save it to a temporary location
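
Strung together, those steps look roughly like this in simplified form (the function and file names are made up, and the metadata work from step 3 is left out here and covered below):

    import Photos
    import CoreImage

    func rotateStillImage(of asset: PHAsset, completion: @escaping (URL?) -> Void) {
        // 1. Find the still-photo resource and request its data.
        guard let photoResource = PHAssetResource.assetResources(for: asset)
            .first(where: { $0.type == .photo }) else {
            completion(nil); return
        }

        var imageData = Data()
        PHAssetResourceManager.default().requestData(for: photoResource, options: nil,
            dataReceivedHandler: { chunk in
                imageData.append(chunk)
            },
            completionHandler: { error in
                // 2. Create a CIImage and rotate it with the extension above.
                guard error == nil, let image = CIImage(data: imageData) else {
                    completion(nil); return
                }
                let rotated = image.rotated90DegreesClockwise()

                // 3. Write a JPEG to a temporary location.
                let outputURL = FileManager.default.temporaryDirectory
                    .appendingPathComponent("rotated.jpg")
                do {
                    try CIContext().writeJPEGRepresentation(of: rotated, to: outputURL,
                                                            colorSpace: CGColorSpaceCreateDeviceRGB())
                    completion(outputURL)
                } catch {
                    completion(nil)
                }
            })
    }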

Rotating Videos

Rotating the video portion of the Live Photo turned out to be much, much trickier. This Technical Q&A from Apple describes which methods actually rotate the buffers and which only set a rotation flag. In order to rotate the video, I needed to use an AVAssetExportSession and apply a transform.

There are 4 orientations that a photo or video may be captured in. I made this convenience method to take the video’s original transform and return information about it.
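
Something along these lines (a simplified sketch; the tuple it returns is just one way to package the information):

    import UIKit

    /// Map a video track's preferredTransform to one of the four capture
    /// orientations, e.g. orientation(for: videoTrack.preferredTransform).
    func orientation(for transform: CGAffineTransform) -> (orientation: UIImage.Orientation, isPortrait: Bool) {
        switch (transform.a, transform.b, transform.c, transform.d) {
        case (0, 1, -1, 0):  return (.right, true)   // portrait
        case (0, -1, 1, 0):  return (.left, true)    // portrait, upside down
        case (-1, 0, 0, -1): return (.down, false)   // landscape, rotated 180°
        default:             return (.up, false)     // landscape, unrotated
        }
    }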

Each of those 4 orientations could then be rotated 3 different ways: 90 degrees, -90 degrees, or 180 degrees. When you rotate the video, you rotate it around its origin point, which can potentially move the video out of the frame. Therefore you have to apply a translation to get it back to where it’s supposed to be. Derek Lucas (@derekplucas) got me started by creating a Playground that rotated videos on the Mac. I took his translation values and had to tweak them, via trial and error, to get it to work on iOS. Here’s just a small sample of what that hot mess looks like:
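
A boiled-down version, assuming a track whose original transform is the identity (each of the other orientations needed its own tweaked values):

    import CoreGraphics

    func rotationTransform(by angle: CGFloat, naturalSize: CGSize) -> CGAffineTransform {
        // Rotation happens around the origin, so each case needs its own
        // translation to pull the frame back into view.
        let rotation = CGAffineTransform(rotationAngle: angle)
        if angle == .pi / 2 {          // 90°: translate by the track's height along x
            return rotation.concatenating(CGAffineTransform(translationX: naturalSize.height, y: 0))
        } else if angle == -.pi / 2 {  // -90°: translate by the track's width along y
            return rotation.concatenating(CGAffineTransform(translationX: 0, y: naturalSize.width))
        } else if angle == .pi {       // 180°: translate by the full natural size
            return rotation.concatenating(CGAffineTransform(translationX: naturalSize.width,
                                                            y: naturalSize.height))
        }
        return rotation
    }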

Once rotated, I saved the video to a temporary file.

Live Photo Metadata

You can’t just throw any photo and video together and make a Live Photo without doing a little extra work. I found this project by genadyo on GitHub that shows what sort of metadata must be written into the photo and video files in order for them to be paired up correctly.

Basically, you have to do 5 things:

  1. Create an identifier of some kind, assign it to the key kFigAppleMakerNote_AssetIdentifier (which is “17”) in a new dictionary and set that dictionary as the kCGImagePropertyMakerAppleDictionary for your JPEG file.
  2. Create an AVMetadataItem where the key is “com.apple.quicktime.content.identifier” and the value is the identifier you created in the first step.
  3. Create an AVMetadataItem where the key is “com.apple.quicktime.still-image-time” and the value is 0. For some reason, this is required in order for iOS to recognize it as a true Live Photo.
  4. Use AVAssetWriter to re-save the video you made using AVAssetExportSession, this time writing in the appropriate metadata. Of course, if you aren’t rotating the video, you could just use AVAssetWriter from start to finish.
  5. Save both the photo and the video to Photos like so (where “fileURLs” is an array containing the two temporary URLs for the photo and video):
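
In sketch form, that save step looks something like this (using PHAssetCreationRequest; the error handling is just illustrative):

    import Photos

    PHPhotoLibrary.shared().performChanges({
        // fileURLs[0] is the JPEG, fileURLs[1] is the QuickTime movie.
        let creationRequest = PHAssetCreationRequest.forAsset()
        creationRequest.addResource(with: .photo, fileURL: fileURLs[0], options: nil)
        creationRequest.addResource(with: .pairedVideo, fileURL: fileURLs[1], options: nil)
    }, completionHandler: { success, error in
        if !success {
            print("Couldn't save the Live Photo: \(String(describing: error))")
        }
    })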

Conclusion

I started LiveRotate on April 27 and finished it on June 6, so it took just a little over a month to make. I’ve had some good suggestions for improvements to the app and hope to implement those soon. For now, though, my brain can finally break free from “obsessive coding” mode and focus on important things like catching up on household chores and cooking some real food! 😉
Edit: 4:35 pm CDT

I forgot to add that I created the app’s icon in Photoshop CS6, and that I translated the app into German, Spanish, Italian, and Russian via a hilarious process of changing my phone’s language, opening up apps that had the words/phrases I needed, and screenshotting them. I know—I’m a dang thief!