Tuesday, October 30, 2012

New Computer Users

I can't remember when, or why exactly, I started using a desktop computer. The fact that it was a clunky desktop system running Windows 93 says something all on its own. Clearly, I didn't have any idea how irrelevant that system would become in short order. New computer users don't understand how quickly technology development moves — if I had known that, I probably wouldn't have bothered. The fact of the matter is that when I first started using a computer, I wasn't a developer, and I wasn't using the thing as a tool to do any meaningful work. The real driving force was simple curiosity. That curiosity, I'm sure, was partly driven by the surrounding hype at the time. Computers, and their interconnected fabric, were the future. If you didn't understand them and tightly integrate them into every aspect of your daily life, you'd be left behind. Back then, I don't think the surrounding hype got the best of me. Hype hasn't slowed down, it's simply moved on — to mobile computing, for instance. I was too busy doing kid stuff to really appreciate the ramifications, and so I was a casual user, probably like a lot of people around the globe who use computers but don't depend on them for critical tasks. Then there are other types of users: developers, and even non-human users. We can classify users based on why they're currently using a computer, not why they first started.
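
To make that last idea a little more concrete, here's a minimal Python sketch of what classifying users by their current purpose, rather than by why they first started, might look like. The class names, fields, and keyword matching are entirely hypothetical and only illustrate the idea.

```python
# Hypothetical sketch: classify users by why they're using a computer right now,
# not by why they first started. Categories and matching rules are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto


class UserClass(Enum):
    CASUAL = auto()      # curiosity, entertainment, light tasks
    DEVELOPER = auto()   # building software as part of the work
    AUTOMATED = auto()   # non-human agents acting on a system


@dataclass
class User:
    name: str
    current_purpose: str  # free-form description of what they're doing today


def classify(user: User) -> UserClass:
    """Assign a class from the user's current purpose, ignoring their history."""
    purpose = user.current_purpose.lower()
    if any(word in purpose for word in ("script", "build", "debug")):
        return UserClass.DEVELOPER
    if "bot" in purpose or "scheduled job" in purpose:
        return UserClass.AUTOMATED
    return UserClass.CASUAL


print(classify(User("me, back then", "browsing out of curiosity")))
# UserClass.CASUAL
```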

When I first started using a computer, my preconceived notions about the technology were hardware-centric. My interests were based on what I could see — it's the machine itself that makes things happen. Which makes sense from an educational standpoint. As you're making your way through school, particularly at the younger end, physical objects are the best illustrators of concepts. This is a triangle, the same as the one you're seeing in the book. This is a toy firetruck, the same as the one driving by out on the street. It's much easier to grasp new concepts when they're presented visually, and in my case, the computer itself was responsible for — whatever it was I thought computers did. From there, you start interacting with objects. It's one thing to see an object in the real world and draw an analogy to a concept formed in your mind. It's a more powerful thing to take a toy broom and start using it the same way the teacher does. The user interface on any computer system is the first thing a new computer user will manipulate if given the chance. This first interaction could make or break that person's lifelong relationship with computer technology. If you have a bad experience, like the first time you play a sport and end up injuring yourself, your impression of that experience taints any future variation of it. And so an interface that allows the user to perform actions on it is better than one with limited capabilities. But it goes further than that, to include instruction. Whether the interface a first-time user is presented with offers many actions or only a restrictive few, those actions need to be discoverable. They need to lead to more inquiries from the user. Well, I can do that, can I do this? That doesn't work, did I do it wrong? Encouraging that kind of deep probing can really suck people in. But not everyone.

If it were true that all that was needed to create a successful user interface were instructional simplicity, there wouldn't be nearly as much software in existence as there is today. If you think about it, software, ultimately, is designed for some actor to solve a particular problem they're having. But if first-time users could all pick up how to perform queries, how to issue commands, and how to extend and improve the system at first glance, we wouldn't need all this specialized software that strives to make things easier for a particular class of user. From another point of view, new computer users are embryonic, and once they're exposed to their new computer environment, they develop new skills, attitudes, and ideas that fall outside the realm of computing. Computer use has a cumulative effect over time, and over this time, the knowledge you're building up helps determine your user class. After first starting with a computer, it doesn't take long to figure out whether it was a loathsome experience or whether it's an important tool you can't figure out how you ever lived without. But from this point onward, you're no longer a new computer user, you're a software user.

Once you find yourself fitting into a general class of software user, you tend to evolve in a certain direction. Over time, you develop a personal taste for what makes a good application. The use cases you're playing into also determine the trajectory of your software choices. You make some bad calls, and your direction changes slightly, toward a different set of software that does mostly the same thing, but maybe does one small aspect of it better. Or maybe you've found some application that works exactly how you see fit. Maybe this resolves some doubts you've had as a user in your chosen problem domain, and you've now further solidified the path you're on. What's interesting is how all this happens. There is such a diverse set of usage patterns out in the software ecosystem that it's very difficult to measure them. I'm not talking about the hordes of Mac and Windows users — I'm talking about the complex configurations that seem to emerge around each new user of the software. Chances are, there is a nearly identical user out there, one who follows the same daily routine and who is trying to solve the exact same problems, but who accomplishes those goals using an entirely different suite of software than someone who has set out to do the same thing. How can there possibly be so many ways to solve the same problem? Or perhaps it's not as much about the finish line as it is about taste. If you think about using computers for entertainment value — movies, games, and so on — there is a completely different set of rationales. Here, as users, we're not thinking about how to stream content so that it has a minimal impact on others streaming the same content. That's the development perspective of someone working at the media company delivering the content. The user looking for entertainment value is looking at the content, but also at the controls that allow them to manipulate it — changing the channel, setting preferences and bookmarks.

If I were to sit down with one of my friends and compare preferences for media on our computers, there are bound to be some similarities, but also some important contrasts. The funny thing is, I could probably track backward through time and see how our preferences diverged. It once again comes down to user class. Or in this case, each of our multiple user classes. I don't think every user around the globe can safely fit into one class of user anymore. I think most of us are wearing multiple hats. Especially if you're making a living working with computers professionally. Here, you're likely to have multiple classes applied to your user profile due to your separation of business life and personal entertainment. And, depending on what you do professionally, your profile may require more classes still to fully capture the complexity of your experience. All this leads me to believe that there is a feedback mechanism in play here. Developing a class definition for a user and applying it to someone isn't as clear-cut as saying that they've used this particular type of software for so long and therefore fall under this tree. I think it's a combination of the choices they've made early on as a user and what led them to make those choices, and once they've established themselves with a base user class, the process repeats itself. Only the longer life with computers goes on, the less significant your decisions become. At this point, the choices the user makes are an exercise in fine-tuning.
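
As a rough sketch of that feedback mechanism, imagine a profile that carries several user classes at once and weights them as choices accumulate, with early choices counting for more than the later fine-tuning. Everything here (the class names, the weighting scheme) is hypothetical, a sketch of the shape of the idea rather than any real system.

```python
# Hypothetical sketch of the feedback loop described above: a profile holds several
# user classes at once, and each new choice nudges the weights, with early choices
# carrying more influence than later ones. Names and weights are illustrative only.
from collections import defaultdict


class UserProfile:
    def __init__(self):
        self.class_weights = defaultdict(float)  # e.g. "developer", "media consumer"
        self.decisions_made = 0

    def record_choice(self, user_class: str) -> None:
        """Each choice reinforces a class, with diminishing influence over time."""
        self.decisions_made += 1
        influence = 1.0 / self.decisions_made  # later decisions are fine-tuning
        self.class_weights[user_class] += influence

    def dominant_classes(self, top_n: int = 2):
        ranked = sorted(self.class_weights.items(), key=lambda kv: kv[1], reverse=True)
        return ranked[:top_n]


profile = UserProfile()
for choice in ["developer", "developer", "media consumer", "developer", "media consumer"]:
    profile.record_choice(choice)

print(profile.dominant_classes())  # "developer" dominates, "media consumer" alongside
```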

Early in a person's life with computers, they're exposed to the notion of free software, or open source software. It isn't likely that they'll make the distinction between free and open source just yet, but that's not important. What matters is the fact that some users make the mental leap to understand that there is software available for use, free of charge, that can likely solve any of their problems. Of course, every individual use case is unique, and it isn't fair to say that paying for software is never the better option — because a lot of the time it is. What is essential is that users understand the distinction early on in life. It's important to understand where open source is the better choice and how to go about using it. It'll set the cost expectations not only for yourself, but for the newer generations that you help along the way. Given that many open source projects are fairly mainstream — you can use Linux for 100% entertainment value — open source is more accessible these days. And the more accessible it is, the more likely its users are to understand the motivations of the developers who created it. And that's a valuable lesson for someone new to software.

So if understanding the full spectrum of software — both paid proprietary and free open source — is the good thing to strive for, is there a bad thing that we should avoid in early computer usage? The only thing I can think of is to never take anything as a given. Especially if you're teaching kids to use computers. I think once you've established a trustful attitude toward software you've never used before, you set the bar low in terms of what you can get out of that software. Learn how to poke holes in the walls and see if the code starts pouring out. I think having this ability will help you detach software quality from branding and ultimately lead to a more informed life with computers.
