Kasra Kyanzadeh

My internship at Khan Academy

I spent the past summer working on the iOS team at Khan Academy.

Khan Academy is best known for their videos, but there are also hundreds of interactive math exercises that students can use to practice skills ranging from counting to calculus. This is what an exercise looks like on the website:

For the past few months, the mobile team has been working hard to bring these exercises to the iPad app. Here are some of the parts I helped build...

Related videos

Most exercises have related videos that students can watch if they're stuck. My first project was to implement a native version of the related videos interface. We liked how the YouTube app lets you drag down a video to interact with other content, so that's what I started with.

In the first prototype, the video stayed on a fixed track, using the finger's y-coordinate to interpolate the position on the track. This felt awkward on the iPad: if you didn't drag along the path, your finger's movement would become disconnected from the movement of the video.
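Roughly, that first version boiled down to something like the sketch below (the names are hypothetical, not the actual prototype code): the drag's y-coordinate becomes a 0 to 1 progress value, and the video's frame is interpolated between its maximized and docked frames.

// A sketch of the fixed-track approach: the finger's y-coordinate drives a
// progress value, and every attribute of the video's frame is interpolated
// between the maximized and docked frames.
static CGRect KAInterpolateRect(CGRect from, CGRect to, CGFloat t) {
    return CGRectMake(from.origin.x + (to.origin.x - from.origin.x) * t,
                      from.origin.y + (to.origin.y - from.origin.y) * t,
                      from.size.width + (to.size.width - from.size.width) * t,
                      from.size.height + (to.size.height - from.size.height) * t);
}

- (void)handleVideoPan:(UIPanGestureRecognizer *)pan {
    CGFloat y = [pan locationInView:self.view].y;
    CGFloat progress = MAX(0, MIN(1, y / CGRectGetHeight(self.view.bounds)));
    self.videoView.frame = KAInterpolateRect(self.maximizedVideoFrame,
                                             self.dockedVideoFrame,
                                             progress);
}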

We also realized that always having a thumbnail on the screen would be a waste of precious pixels, especially since a lot of the thumbnails would be black rectangles with a few scribbles.

So we played with having three states: hidden, maximized, and docked. In the docked state, the video would keep playing in a corner, letting the student interact with the exercise while watching.

Since exercises can have multiple related videos, this version let the student swipe left and right to switch between the different videos in the maximized state.

This prototype also allowed the video to be docked anywhere along the bottom of the screen. As we added more UI elements to the screen, we ended up pinning the docked video to the bottom right to make it more consistent.

I spent a lot of time playing with these prototypes. I noticed that if your finger wasn't centered in the video view when docking the video, the finger would become detached from the video as it shrank, ruining the illusion of direct manipulation. I fixed this by setting the view's anchor point to the touch position, which makes the view shrink around the user's finger: a simple change that prevents limb-view detachment. See for yourself:

Before: Anchor point is always in the center of the video view. The touch leaves the view as it shrinks.
After: Anchor point is set to the touch position. The view resizes around the touch, so the touch never leaves the view.
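For the curious, the fix is only a few lines. Here's a rough sketch (the method name is mine, not the app's): re-anchor the view's layer at the touch, then move the layer's position to the same on-screen point so the re-anchoring doesn't visibly shift the view.

// Make the video view shrink around the finger: put the layer's anchor point
// at the touch, then move the layer's position to the same on-screen point so
// changing the anchor point doesn't move the view.
- (void)beginDockingWithTouch:(UITouch *)touch videoView:(UIView *)videoView {
    CGPoint touchInVideo = [touch locationInView:videoView];
    CGPoint touchInSuperview = [touch locationInView:videoView.superview];

    videoView.layer.anchorPoint = CGPointMake(touchInVideo.x / CGRectGetWidth(videoView.bounds),
                                              touchInVideo.y / CGRectGetHeight(videoView.bounds));
    videoView.layer.position = touchInSuperview;
}

From then on, any bounds or transform animation scales the view around the touch instead of around its center.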

Plot twist! Most of this didn't make it into the app. When we tested with students, it wasn't clear that the fancy interactions were helpful. So we ended up shipping a simpler implementation without docking.

Scratchpad

My next project was to make a native version of the exercise scratchpad. A lot of students use the scratchpad on the website to do rough work, even though writing with a mouse can be painful, so we knew it was a valuable feature and that we could provide a far better experience on a touch device. This ended up being my biggest project.

It was decided early on during the development of the app that exercise content would be rendered in a web view—it was the only feasible way to bring hundreds of exercises with the resources we had. This meant the scratchpad canvas (a native view) would cover the webview rendering the exercise.

This arrangement was complicated by the fact that all exercises have interactive regions. For example, in the exercise below, the blocks have to be dragged to put them in the correct order.

So the main problem was hit testing: when a touch starts, should we start drawing on the scratchpad or let the webview handle the touch?

My first solution was initially promising but ultimately sketchy: it relied on internal implementation details of how native touches are handled when Javascript calls e.preventDefault() on the touch event. It became clear that the right way to do it would be to ask the webpage directly.

So we marked all the interactive DOM elements with a special class, and added an endpoint to the Javascript API that tells the caller if there is an interactive element at the given (x, y) coordinate on the page.

From the native side, when a touch begins on the scratchpad canvas, the app calls the Javascript API to see if the touch began on an interactive element. At first, I did a simple synchronous call with stringByEvaluatingJavaScriptFromString:, but this didn't feel very good, especially on older iPads. The blocking call would drop the first few points of the touch, which was particularly annoying for short strokes, like decimal points.

So I ended up making an async call into Javascript and queueing up the touch points without rendering them. Then, depending on the response, the app either renders the queued points or throws them out and lets the webview handle the touch.
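In outline, the touch handling looks something like the sketch below. The names are hypothetical, and the bridge into the page's Javascript is glossed over, since this post doesn't go into that mechanism.

// Queue points while waiting to hear back from the page, then either render
// them as ink or hand the touch off to the webview.
- (void)scratchpadTouchBeganAtPoint:(CGPoint)point {
    self.queuedPoints = [NSMutableArray arrayWithObject:[NSValue valueWithCGPoint:point]];
    self.waitingForHitTest = YES;

    // Hypothetical async bridge call into the page's Javascript API.
    [self.exerciseBridge isInteractiveElementAtPoint:point completion:^(BOOL isInteractive) {
        self.waitingForHitTest = NO;
        if (isInteractive) {
            // The touch began on an interactive element (e.g. a draggable block),
            // so throw out the queued points and let the webview handle the touch.
            [self.queuedPoints removeAllObjects];
        } else {
            // The touch began on empty space: render the queued stroke.
            [self renderPoints:self.queuedPoints];
        }
    }];
}

- (void)scratchpadTouchMovedToPoint:(CGPoint)point {
    if (self.waitingForHitTest) {
        [self.queuedPoints addObject:[NSValue valueWithCGPoint:point]];
    } else {
        [self renderPoint:point];
    }
}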

As a nice side-effect, this async API is forward compatible with WKWebView (more on that in a bit).

There were a few other interesting problems to solve with the scratchpad that I might write about later. Big thanks to Andy for improving inking and a beautiful refactor, and Matt for helping with two-finger scrolling.

Other things…

I got to work on a bunch of smaller things as well. Here are some of them.

I had a lot of fun working on animations in the app. This is a bouncy animation I made to show the student's progress in an exercise. The shipped version has confetti, so go download the app to experience the full glory. Thanks to fellow intern Elizabeth Lin for helping design this and many of the other things you saw in this post!

Interaction Prototyping

I took a break from iOS land during the 4-day Healthy Hackathon and teamed up with Andy and Ben to work on a proof-of-concept for a UI prototyping tool. Having written way too much code to explore different interactions for the related videos, we wanted to see how we could make iterating on interactive designs faster.

We built a visual editor that emits framer.js code. Framer.js is a Javascript library that helps build interactive prototypes, but it can take a lot of fiddly code to get things where you want them.

Our editor lets you create and place elements visually to set the scene, and only write code to describe the interactions. This hybrid approach saves a lot of boilerplate code.

UIWebView → WKWebView

After WKWebView was announced, I spent some time trying to make the app use it on iOS 8+ devices, while continuing to use UIWebView on iOS 7 devices.

This involved converting all the communication between the app and webview to be asynchronous, since WKWebView's API for calling Javascript functions from Objective-C is non-blocking.
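Concretely, that meant something along these lines (a sketch assuming both web views live behind one small wrapper; the property names are mine):

// A single async entry point for evaluating Javascript, whichever web view
// happens to be backing the exercise.
- (void)evaluateJavaScript:(NSString *)script completion:(void (^)(id result))completion {
    if (self.wkWebView) {
        // WKWebView is asynchronous by design.
        [self.wkWebView evaluateJavaScript:script completionHandler:^(id result, NSError *error) {
            completion(error ? nil : result);
        }];
    } else {
        // UIWebView's call is synchronous; bounce through the run loop so
        // callers always get their result asynchronously on either path.
        dispatch_async(dispatch_get_main_queue(), ^{
            completion([self.uiWebView stringByEvaluatingJavaScriptFromString:script]);
        });
    }
}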

But when I actually ran the app on an iOS 8 device, nothing loaded. That's because WKWebView can't load files directly from the local filesystem (this is probably going to be fixed). Just for fun, I hacked up a version of the app with a local webserver to serve the local files. In brief unscientific testing (i.e. I just played around with some exercises), I didn't notice huge performance improvements.

Launch to exercise

I added a way to go directly to a specific exercise when launching the app from Xcode. A simple change, but one that saved a lot of time, especially when debugging issues for a specific exercise.

It uses a custom Xcode scheme that sets an environment variable with the name of the exercise to load on launch.

The value is read in application:didFinishLaunchingWithOptions:

#if DEBUG
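// Read the exercise name that the custom scheme sets as an environment variable.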
NSDictionary *env = [[NSProcessInfo processInfo] environment];
NSString *initialExerciseName = env[@"initialExerciseName"];
if (initialExerciseName) {
    [self loadExerciseWithName:initialExerciseName];
}
#endif

Thank you!

Learning in progress.

I got to work on really fun problems with really great people, which are the two secret ingredients for a great internship. Thanks to everyone at Khan Academy, especially the iOS team for teaching me a ton: Mike Parker, Andy Matuschak, Laura Savino, and Marcos Ojeda.

And thank you for reading this. If you liked this post, follow me on Twitter!

P.S. If you want other interns' experiences, here are a bunch of posts:

Thanks to Leila and Nick for reading drafts of this post.
Published January 20, 2015.