Computers in general are too hard to use.
The largest problem with computers is that one typically has to learn a slew of jargon and conventions in order to do simple tasks. This is perhaps my biggest pet peeve with computers: too much technology gets thrown in a person's face. Whether it's a faulty driver, the name of a mail server, or a broken Back button in a web browser, the average person doesn't care about the technology. Should she?
I say no. Mind you, I know this stuff inside-out and practically breathe it. But can you explain to me why someone has to understand the driver model in order to use a damn digital camera? Why are there multiple layers of complexity when it comes to scanning a picture? Why is there so little control in so many operating systems (especially those created by Microsoft)?
I understand and acknowledge the need for ground rules. Historically, the Mac has been clearly superior in this respect and arguably still is today. Windows has ground rules as well, but they're considerably more low-level (here's a drive, here's a directory, here's a program, good luck finding the button that does what you want it to do). On the Mac side, important documents such as the Human Interface Guidelines ensure that everything "just works" the way it's supposed to.
I'm also all right with a learning curve if it isn't steep. The Windows learning curve is still far, far too steep. It's gotten better, definitely, but it still ultimately feels like a bunch of parts cobbled together to make something that usually works.
Now, said learning curve is going away for younger people. I fear that this isn't because computers are getting easier to use (particularly the dominant OS), but rather that computers are becoming more ubiquitous. People are getting used to the idiosyncrasies of Windows - why that button does one thing here but something else over there. They're getting used to these concepts instead of questioning them.
If you detect a tinge of the sheep mentality here, it's no accident. A lot of people do choose Windows - but many, many more do not.
I was honestly hoping not to turn this into a giant Windows slam, but the more I hear about difficulties using Windows machines, the more I appreciate everything about OS X (and Macs in general). Tasks that should be simple, such as emailing photos, are simple. Tasks that should be complex (such as changing user permissions to ensure that no one outside of a defined group can access files) are complex. In other words, it's the closest we have to the ideal OS.
Is it perfect? Hardly. But that's where my interest comes in. I'm becoming quite interested in the ol' HCI field: Human-Computer Interaction. I find the ways that people interact with computers intriguing, in part because studying them gives us the opportunity to improve what may be seriously faulty.
Computers are faulty. Still. There's a lot that needs to be done to make them invisible, and ideal. The perfect computer is one that is invisible. It is a tool that helps you get things done, or helps you communicate, or helps you be creative. It is not a tool that interferes with this process, questions what you want to do, or holds your hand too much - unless, of course, you want a lot of hand-holding.
The invisible computer is what I want to help create.