Sponsor Me!

Currently, I'm publishing sporadically (as in, there was a span of 10 months between the last post and this one). I'd like to write and publish more. Unfortunately, I'm a super busy person, especially since I work a 9-to-5 job five days a week. If you want to help me free up more time so I can write and publish more, please buy me a coffee or sponsor me through recurring Patreon payments (so you don't forget!).





Friday, September 09, 2011

Combining the Powers of iPad & Blackberry

For the last couple of months, when time has presented itself (seriously, I had no control over or management of my time during July), I've worked on transcribing a handwritten sermon by George Ripley. He was a Unitarian minister at the Purchase Street Church in Boston during the 1830s.

The sermon is called On Common Sense in the Affairs of Religion. The content of the sermon doesn't matter for this essay.

Rather, I want to focus on how my iPad and Blackberry have combined into a useful set of tools. The specific brands and models of the mobile devices probably don't make a huge difference. A Playbook and an iPhone might work just as well.

Actually... an iPhone may not work as well. I use the Blackberry's Memo app to type in the transcription. Any app that accepts text, saves it, and syncs it with other devices will work fine. Software isn't the issue.

The hardware might be. I've encountered people with strong opinions about touchscreen keyboards versus physical keyboards. The dynamism of the iPhone's screen compared to a static, unmoving keyboard plays a big part. While transcribing Ripley's sermon, I rarely look at the Blackberry keyboard. My thumbs know where to find the right keys, and the keys stay in place.

I'm guessing you can lock the orientation of the iPhone screen and keyboard to keep it from flipping all around. Maybe that's not as big of an issue as I had originally thought. My preference probably has more to do with physically pushing the Blackberry keys.

A tech nerd friend of mine once told me the story of Steve Jobs's genius in getting the iPhone and iPad to click when you push a key on the virtual keyboard. Apparently people wanted more than just visual feedback when they pushed a key; they wanted some other sensory feedback. The sound of a click made all the difference to people in marketing survey groups.

The feeling of separate keys on the little Blackberry keyboard, however, keeps me oriented when I type without looking. Through my many years of typing and a class or two, I've gotten the QWERTY keyboard mapped out in mind and body.

QWERTY keyboards can vary in size, but the letters on them don't change position. A virtual keyboard can only orient me while I'm looking at it. I don't have to move my eyes around to keep track of my hands and fingers, but I do have to keep my gaze on the keyboard while typing to map the board according to key size.

I can feel my way around the letter keys on a Blackberry, though. Put my thumbs on the home keys or even just the space bar, and I know where all the other keys are. I often type on it without looking while I walk down the street or with my hands and Blackberry in a pocket. I can handle the keyboard because I can feel it.

It's just as easy to keep my eyes on a screen full of text and type that text into the Blackberry by feel. Meanwhile, the iPad's direct and intuitive interface keeps the cognitive load at a minimum.

A big-screened desktop or laptop provides an easy view, as long as I don’t need to change the size of what I’m viewing or the brightness of the screen. Things get complicated from that point on.

Commands for changing the screen size or degree of zoom on a computer vary by program. Even within one program, there are so many ways to make the change. We have menus at the top of the screen with options under them. Going this route, half the time you then have to choose a zoom value, usually in the form of a percentage.

Or maybe you prefer the icon toolbar underneath the menus. You can turn your cursor into a magnifying glass, then choose where to zoom in. You can click as many times as you like to reach the level of zoom you want. But if you want to zoom out, you have to switch the icon and then click to go in reverse.

Then we have the keyboard “shortcuts.” Anybody remember the days of DOS, when the keyboard was our only real input device? Typing Ctrl-+ and Ctrl-- has its elegance and instant gratification, but the computer decides where to zoom. Afterward, you get the great challenge of finding your way back to whatever you wanted to read or look at.

It’s like a game, but who wants to play a game when reading a document? Especially when the author of the document has horrible, horrible handwriting?

With the iPad, though, you just use your index finger and middle finger. Place them close together and spread them apart or vice versa, wherever you want to zoom. You can also directly manipulate the screen view without using your keyboard or mouse as an intermediary. Your fingers have become a direct input interface to the iPad.

Adjusting the brightness on a desktop or laptop monitor becomes a hassle, too. The laptop at least provides the convenience of a keyboard shortcut. With the desktop, though, you have to change paradigms, switching from the mouse/keyboard combo to fiddling with knobs on the monitor. Then, for both of them, you go back and forth in tinier and tinier movements until you reach the perfect illumination.

Our operating systems have built-in options for adjusting brightness, too. Instead of changing paradigms on the keyboard, or between two completely different types of controls, you get to jump around from program to program.

With OSes these days, it doesn’t necessarily come off as inconvenient, but something new gets to pop up onto the screen. Yay... more clutter.

Again, the iPad provides the immediate, direct experience. Adjusting the brightness requires nothing more than pushing the button on the left side of the screen (or whichever side you prefer), swiping the bar that pops up at the bottom to the right, then moving a virtual knob left and right until you reach the desired brightness.

Maybe the iPad requires just as many steps as the desktop or laptop. But it’s all right in front of you, on one flat piece of technology that you can hold with your two hands. Every step of the way you control it with the tips of your fingers.

The iPad doesn’t leave the user at the mercy of having to interact through two or three layers of interface: hand to keyboard/mouse, keyboard/mouse to cursor, cursor to end result. The most indirect interface on the iPad is when you’re typing. Even then, it’s all on the same screen.

But putting the cursor in the middle of a word can become a hassle. Making corrections can become a problem. Without the sense of touch and consistent spacing between keys, it’s easy to mistype.

Now I’ve come full circle. That’s why I’ve got the Blackberry to keep notes and the laptop to write documents. Every paradigm has its strong points and weak points. The laptop/desktop makes for good content creation. The iPad alone (or any tablet?) makes for good content consumption, especially online and away from home.

The iPad/Blackberry combo makes for an awesome transcribing machine: iPad for viewing and Blackberry for transcribing. Even better: they’re both super portable.

A neat feature of this essay: I wrote the first part at the beginning of August and the second part at the beginning of September. The passion and excitement for the endeavor remains, even if I never finished the transcription and moved on to a different approach for my project. The method of transcription had nothing to do with the new angle. Maybe the bad handwriting had something to do with losing my interest.

The main thing motivating the change has to do with seeing life pass me by as much as it does. That’s a whole other story, though.

Links of Note: George Ripley, Unitarian, Purchase Street Church, All Things Dork


