10 outdated elements of desktop operating systems



We've come so very far in the way computer operating systems treat us, and in the way we treat those computer operating systems. They multitask, they animate, they reach into the internet and pull down our favorite parts, they rarely crash and they're always on. It's a far cry from a decade ago, but I think we could go so much further. The advent of the cheap, ubiquitous touchscreen, always-available internet and continually cheaper and more powerful hardware has revolutionized the phone industry, and I think it can also help the desktops and laptops we know and love do more for us. But a laptop isn't a phone: we're supposed to get a lot done on it, under some unrealistic deadlines, and some random company with big ideas can't come along and reinvent the desktop OS in one fell swoop -- that simply isn't practical when we have things to do.

So what's an OS to do? I think there are serious opportunities for evolution available to the Microsofts, Apples and Ubuntus of the world, but they involve embracing new technologies in new ways. And stealing a ton of ideas from phones. A finger on a screen is not a mouse on a pad, an internet browser is not the be-all and end-all of the internet, and playing Crysis at quad HD resolution and 60 fps is not the ultimate expression of gaming for 95% of the population. Join me as I explore a few bits of legacy cruft that need to be addressed before the desktop OS can become as important to this decade as it was to the last one.


1. Window management

Problem: Spending time hunting for the text document I'm editing under a dozen other windows.

This is at the top of the list because it's probably my most frequent frustration: I'm always looking for the right window. Sure, you might tell me I can use "Spaces" or command / ctrl+tab or some other wild method of shuffling between my windows, but if tools like that exist to help you shuffle through the clutter, there's probably a deeper problem here. Everybody (my mom) always says that the best way to keep a room clean is to have a place for everything and never let it get messy. I can't even count on one hand the features introduced in Windows and Mac OS this decade to help me "manage" my windows, but what if I never wanted to sign up for that job? An operating system is about performing tasks, not juggling. Add a touch of ADD, and what might seem like a logical, modern operating system to some just ties my poor brain in knots as I hunt for what I'm doing now or what I'm doing next.

Solution: webOS




A theoretical computer that has a touchscreen just for kicks has room for adding more intuitive gestures into the workflow. Instead of trying to remember a key command or a four-finger gesture, a bit of on-screen multitouch could probably rearrange those windows in a jiffy. Some of the Windows 7 snap-to functions are also very intelligent and could aid in this task. A single swipe in webOS puts me in a "card" view that is the overarching metaphor for navigating through the OS, not something tacked on to make it merely livable.

The other thing I would pull from webOS and other phone operating systems is the idea of pushing an app completely off the desktop and out of mind, while allowing it to run a background process to pull in its relevant push information or perform whatever other duty it has (a minimized window or app still remains in the task switcher or in the task bar, and therefore in the way). A nice little touchscreen flick (or maybe a pinch and flick, go wild!) could tell my computer that I don't want to see that entire application anymore -- secure in the knowledge that Growl will pick up anything I'm missing by not having that window poking through 1/32nd of my screen.

Of course, we still have to multitask, since this is a desktop operating system. That's where things get tough, but I still think there's a way. Take that speed dial view in Chrome and Safari, for instance: it's a natural interface that adjusts itself to my use of the browser and provides shortcuts in a relevant, organized manner. In the case of webOS, a larger screen could allow for a two-up card view, where you pick two cards to coexist. For the most part, if I'm actively doing more than two operations at once, I'm not really getting anything accomplished. I'd much rather drill through tasks and then send them away than see how many items I can allow to coexist on my desktop before I lose my sanity. I'm frequently afflicted with an overabundance of tabs in my browser, but at least it presents them one at a time. I can read that page or bookmark it, close it and move on to the next tab. I'd never want to keep a window for Google or Facebook open at all times "just in case" a need for it arises.

Perhaps we've gotten too lazy with our implementation of drag and drop? If I could drag to a virtual target, such as the Start Menu or Spotlight, and then just start typing my desired destination (iPhoto, Gmail, Tweetie, Windows Movie Maker, Facebook), which would subsequently launch in a way that deals with what I'm dragging, it would reduce steps and clutter. I want to execute tasks, not operate apps. That's something our command-line Unix friends know well. Quicksilver users, too.
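
To make this concrete, here's a rough sketch of how such a drag-to-search target might resolve a dropped file. The handler registry and everything in it are my own invention for illustration -- no OS actually exposes this API:

    # Hypothetical drag-to-search resolution (all names invented).
    import mimetypes

    # Apps and web services register the payload types they accept.
    HANDLERS = {
        "iPhoto":  {"image/jpeg", "image/png"},
        "Gmail":   {"*"},                       # attach anything to a draft
        "Flickr":  {"image/jpeg", "image/png"},
        "Tweetie": {"image/jpeg", "text/plain"},
    }

    def resolve_drop(path, typed_query):
        """Return handlers matching the typed query that accept this file."""
        mime, _ = mimetypes.guess_type(path)
        matches = []
        for app, accepted in HANDLERS.items():
            if typed_query.lower() not in app.lower():
                continue                        # not what the user is typing
            if "*" in accepted or mime in accepted:
                matches.append(app)             # this app can take the payload
        return matches

    # Drag a photo onto the target, type three letters, and the OS
    # already knows where that photo can go:
    print(resolve_drop("vacation.jpg", "fli"))  # ['Flickr']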


2. Inappropriate use of touch

Problem: Fingers aren't good at "clicking."

So, while we're on the subject of touch, let's dive in a little deeper. I think touch implementation is one of the biggest missed opportunities of Windows 7, and while Apple has the luxury of not selling any computers with touchscreens, and therefore not having to "worry" about it, that doesn't let it off the hook in my book -- the technology is clearly cheap and available.

And it should be used! We're all completely familiar with touching things in the real world, so it makes sense that we'd touch them in a virtual one. But five fingers are not a mouse cursor. I wonder what percentage of my day I use my fingers to point at something. One percent? Two? And yet I'm expected to point, tap and flick my heart out to get anything done on Windows 7 with touch, in a base imitation of the mouse.

Solution: webOS, Project Natal




Pinch to zoom is just one tiny little expression of what fingers can do. Fingers can grab, snap, smudge, toss and even kick paper footballs through your buddy's improvised uprights. Now, I can't pretend to know how all these things should be implemented on the desktop, but I think they can be made use of in really interesting ways if they're worked in at the OS level instead of consigned to some lame imitation of existing mouse and keyboard commands. After all, the mouse and keyboard already work great; why would I want to expend more energy just to fake what they can do so well? I'd rather use my hands in new ways to do new things.

As I mentioned previously, webOS makes great use of the flick gesture (also the horizontal swipe), but I think there's also an opportunity for hand gestures away from the touchscreen. After all, on the desktop it hardly even makes sense to hold your arms all the way up to your screen to make something happen. Microsoft has shown off concepts of gestures performed in a 3D space in front of the computer which seem promising, and is doing similar things with Natal. If I'm just going to wave my arm to emulate a d-pad then I'm not interested, but if I can wave my arm and make something magic happen, then let's talk.


3. Lack of integration with browser, websites and web services

Problem: Web apps make me find them, and that's too much work.

I'm gonna go a little crazy here and say that we're not necessarily going to leave the desktop behind and dive into some glorious world of just needing a web browser. In fact, when I think about some of my favorite applications, they're the exact reverse of that trend. Mailplane puts my Gmail in a familiar, functional desktop context, Tweetie makes managing Twitter a casual, persistent activity instead of a chore, and iPhoto manages my social photo sharing folders for me so I don't have to stare at endless web forms and upload dialogs to get stuff on Flickr.

Phones know this trend well (thanks in part to their sluggish web browsers), and with the advent of the iPhone and Android we've seen an endless stream of apps that pull data from the cloud and present it in friendly, powerful interfaces. Why would I want to go to the cloud in the browser when I can have the best elements of the internet in my comfy desktop apps?

Solution: iPhone




It's not hard to imagine how web applications should act: they should act exactly like desktop applications. You should be able to use them online or offline, drag and drop files between them, and have a universal set of key commands that make everything more badass. But at the same time, desktop applications need to wise up and learn how to act more like web apps. Storage in the cloud should be a default, not a special feature, updates should be continual and granular (we're already close on this front), and the splicing in of web streams of information should be leveraged as often as possible. Most of this is already the case in the best-of-breed apps, and Apple and Microsoft have already spearheaded some initiatives on this front, but there's still way too much of my operating system that seems hardly aware of the internet, and that's just so early 2000s.
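
As a minimal sketch of "cloud by default" from the application's side, consider an offline-first save: the local write always succeeds immediately, and the cloud copy is reconciled in the background. The upload call here is a stand-in, not a real sync API:

    import json, pathlib, queue, threading

    sync_queue = queue.Queue()

    def upload_to_cloud(path):
        # Stand-in for a real cloud sync call (an assumption of this sketch).
        print(f"synced {path.name} to the cloud")

    def save(path, data):
        path.write_text(json.dumps(data))  # the local disk always wins first
        sync_queue.put(path)               # the cloud catches up asynchronously

    def sync_worker():
        while True:
            pending = sync_queue.get()
            upload_to_cloud(pending)
            sync_queue.task_done()

    threading.Thread(target=sync_worker, daemon=True).start()

    save(pathlib.Path("notes.json"), {"text": "works offline, syncs later"})
    sync_queue.join()  # a real app would just keep running

The point is that the user never waits on the network to hit save; the app treats connectivity as an optimization, not a requirement.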


4. Power management, graphical hardware management

Problem: I don't know if I'm going to get two hours or five hours off of this charge, or how to fix it.

If you plug a 3G card into your laptop, you're going to get less time off your current charge. But how much? The same goes for WiFi, Bluetooth, display brightness, spinning up the disc drive... I could go on. All of these items draw a pretty reliable amount of juice when in use, and can be disabled or avoided when an expert user wants to milk their battery for all it's worth. Unfortunately, most people don't have the know-how to disable this stuff, or just don't want to deal with the hassle, and that means a sub-optimal experience. Even if you do go through all the trouble of disabling every single power sucker, you don't know how much good you did yourself -- maybe you wasted more time shuffling around the Control Panel than you saved in juice? Luckily, the future is already here, just not evenly distributed.

Solution: Android, HP





Ever since Android 1.6, we've been able to easily and quickly toggle the various power drains on our Android phones from the home screen. Instead of diving into sub-menus, it's just a matter of toggling 3G, Bluetooth and WiFi on and off when it matters. I typically get double the battery life if I switch off WiFi and drop my phone down to EDGE. This isn't rocket science at all; it just means Microsoft and Apple have to pull all the disparate controls for hardware into their energy saving panels and let people have at it.

The second part of this equation is more complicated, but hardly out of reach. In fact, HP already has power usage software in place for some of its enterprise-class laptops, and this sort of software for IT has been available for a while. A perfect implementation in a consumer desktop operating system would provide estimates of how much power you'll really save by switching off WiFi or dimming the display, letting users make informed decisions about power saving. Bonus points for a slider that lets someone simply choose how long they need the laptop to last and lets the software figure out the best way to accomplish that.
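
The math such a panel would surface is back-of-the-envelope stuff. Here's a sketch with wattage figures I made up for illustration -- real numbers would come from the hardware:

    # Assumed component draws in watts -- placeholders, not measurements.
    DRAW_WATTS = {
        "base":      6.0,   # CPU idle, RAM, chipset
        "display":   3.5,   # at current brightness
        "wifi":      0.8,
        "bluetooth": 0.2,
        "3g_card":   2.5,
    }

    BATTERY_WH = 56.0       # a mid-size laptop battery pack

    def hours_remaining(enabled):
        return BATTERY_WH / sum(DRAW_WATTS[part] for part in enabled)

    everything = set(DRAW_WATTS)
    radios_off = everything - {"wifi", "bluetooth", "3g_card"}

    print(f"everything on: {hours_remaining(everything):.1f} hours")  # ~4.3
    print(f"radios off:    {hours_remaining(radios_off):.1f} hours")  # ~5.9

The panel's whole job is to show you that delta before you flip the switch, instead of leaving you to guess.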


5. No unified notification tray

Problem: Useful notification is an afterthought in Windows and hacked on by a third party on the Mac.

Many people, myself included, are almost scarily dependent on Growl notifications. When the Snow Leopard upgrade broke Growl temporarily, it sent shivers down many spines, but it also illustrated a real issue with Mac OS X: pushed, pop-over notifications shouldn't be some optional feature cobbled together by "the community," they should be deeply integrated into the OS. Windows has been much better about this, but it could still improve. There's no true "problem" with the way Growl and Windows do notifications right now, but I think we've seen from mobile phones that the very concept could be rethought and revolutionized.

Solution: webOS and Android




You knew where this was going. Android and webOS have benefited enormously from their respective notification "trays," with Android's up top as a virtual drawer, and the webOS version presented as a space that's quasi-off screen. The great thing about a desktop OS is that there's room to grow this formula. Instead of just popping up a notification and then hiding it in a drawer for perusal, that very drawer could become a powerful method for directly interacting with the respective applications. Without requiring any surface change to existing applications, notifications and controls could be "pushed back."

IM and email are shoo-ins: merely provide an expandable text area below the message to enter a quick response. If the conversation needs your full attention, just click it to view it in Adium or Pidgin, but the majority of conversations can be attended to with a quick reply.

Other applications could provide a set of "what would you like to do next?" buttons to let you deal with what they've accomplished. That effects render all done? "Save, Export, or Save and Quit" buttons could give you momentum going back into Final Cut. Like on webOS, a music player's controls could be persistent in the tray. Other apps, if they were so daring, could even provide shortcuts to popular tasks where you could drag and drop elements (like a URL or photo from an IM) from the tray to Mail or Chrome or Flickr.
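
Here's a loose sketch of what an actionable tray entry might look like to a developer. The entire API is hypothetical, invented just to illustrate the idea:

    from dataclasses import dataclass, field
    from typing import Callable, Dict, Optional

    @dataclass
    class Notification:
        app: str
        text: str
        # Label -> callback, rendered as buttons in the tray entry.
        actions: Dict[str, Callable[[], None]] = field(default_factory=dict)
        # Expandable text box for quick replies, if the app supports it.
        inline_reply: Optional[Callable[[str], None]] = None

    render_done = Notification(
        app="Final Cut",
        text="Effects render finished",
        actions={
            "Save":          lambda: print("saved"),
            "Export":        lambda: print("exporting"),
            "Save and Quit": lambda: print("saved, quitting"),
        },
    )

    im_ping = Notification(
        app="Adium",
        text="Josh: lunch?",
        inline_reply=lambda reply: print(f"-> Josh: {reply}"),
    )

    # The tray invokes these when you click a button or hit enter:
    render_done.actions["Export"]()
    im_ping.inline_reply("be there in 10")  # replied without opening Adium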

Of course, if made too complicated, a notification tray starts to lose its charm, and there's always room for third party additions to a barebones implementation, but no matter what, I think the "tray" metaphor or something similar is absolutely essential for the future of the desktop OS. It wouldn't hurt the iPhone OS, either!


6. Lack of standardized hardware targets for gaming

Problem: I'm running a "pro" system that was built less than a year ago, and yet I have no idea which games (if any) are even worth installing.

Another way to phrase this is "I'm a console gamer." I know plenty of people play some amazing games on the PC, and it's still the ideal system for an FPS, MMO or RTS. But it's a hassle. Services like Steam have made obtaining games easy and often cheap, but at the end of the day you have to be of above-average intelligence to pick a game that's a good graphical fit for your mid-tier desktop or high-end laptop, and a computer genius to tweak your system so the game runs at the highest settings your machine can handle.

Even when I've been presented with a best-case-scenario gaming PC, I still drop frames in the messy parts, or have to shut off a few of the fancy graphical touches that make the game look good at the high resolution PC games are known for. Or I drop the resolution and loathe myself for it. I know I'm doing something wrong, or not spending enough on gear, but I'd really rather just pay $60 for that console disc and enjoy a known quantity from my couch.

Solution: iPhone, prayer




If there were an easy solution for this, a steadily marginalized PC gaming industry would've come up with it by now. I certainly don't know the best way to proceed, but I think the contrast between Mac gaming and iPhone gaming is a good example of what it might take. The iPhone's solitary hardware target was obviously a dream come true for developers, who had previously known such joys only on the tightly-controlled handhelds from Sony and Nintendo. Even after the introduction of the 3GS, we've seen games that run well on both tiers of iPhone hardware, with an easy-to-track differentiation between the two. The unitasking nature of the iPhone also lends itself to reliable performance -- you always know how much of the system's attention the app is going to get.

This doesn't help PCs much, however, since there are about as many currently available PC SKUs as there are iPhones in use. One potential help is AMD's new "VISION" branding for PCs. This is mostly useful for differentiating what sort of media handling capabilities a system has, and it will obviously become much more of a moving target once a new generation of systems is introduced (you can't differentiate around 1080p forever). There's also hope that DirectX 11 could take off in a way DirectX 10 never did, providing a new target for game developers: the lowest-end DirectX 11 gaming card on the market. But I'm not holding my breath.


7. Cost

Problem: My netbook costs $50 more than it needs to and I can't change the desktop picture for some reason.

While PC manufacturers are all busy one-upping each other in feature sets and blasting away at MSRP, Microsoft is in the enviable position of licensing software it's already built. Of course, Microsoft has to recoup its considerable development and support costs, but there's something that feels just a little "wrong" about stripping features out of an OS to sell it cheap to an OEM -- particularly when it's more expensive than Windows XP was. Windows 7 Starter usually amounts to a $50 bump over a comparable system with Linux or Windows XP. Microsoft isn't necessarily abusing a monopoly in this space -- there's nothing theoretically stopping anybody from pumping a couple billion dollars into Linux or some new brand of desktop OS and facing off with Microsoft head-on -- but since there isn't a true Windows alternative, I think Microsoft gets away with charging more than it should for all versions of its OS, and stripping too many features out of its "cheap" editions.

Apple isn't any better, since it simply doesn't give us any choice. Maybe Apple doesn't think people want something akin to an Apple-built netbook, but I'd say there are thousands of hackintoshes out there that would beg to differ. The fear of cannibalized sales is understandable, and so far Apple's choices are paying off just fine on the balance sheet, but it's hard to think of a desktop OS like Mac OS X as truly "current" or "of the times" when most people shopping for a laptop today aren't even planning on spending half the cost of the cheapest MacBook.

Solution: Linux (maybe)




While tablets outnumbered smartbooks by a wide margin at CES, I think the latter category has much greater implications for Real Computing in the near term. Away from the Microsoft and Intel taxes, $200 ARM-based smartbooks running Linux could offer a new version of computing for the everyman. Google seems to think so, judging by its Chrome OS gambit, and Intel's experiment with Moblin points to half of this trend as well. If an OS can be "good enough" without costing an OEM $100 for every PC it's slapped onto, or dictating exactly what sort of parts can or can't be used, then we could be looking at an incredibly different new era of computing. Of course, Linux has to get there first, and people have to overcome their fears of something that doesn't run MS Word or put a blue IE icon on the desktop.

A more expedient solution would be for Microsoft and Apple to slash their licensing costs and drop some of the restrictions. Perhaps it'll take a successful Linux to make that happen?


8. Complexity leading to click abundance

Problem: I can't remember that key command, but hunting through the menus with my mouse is too much work.

This might come across as a little esoteric, but I think at some point we lost our way with key commands. Outside of simple things like ctrl+C, ctrl+V and ctrl+S, most key commands are designed for and typically used by experts. There's nothing wrong with that, and I'm happy that experts are presented with a highly efficient method of using their application of choice, but for the rest of us, or for the expert who finds herself in a foreign app, there needs to be a better way to discover functionality than clicking like a maniac.

It typically looks like this: I know what I want to accomplish in Photoshop, but I don't know how. So I start clicking. I click up in the menu bar and shuttle back and forth to see if something catches my eye. I right-click on some aspects of the layers menu, I click into some of my tools, I tab through a few thousand palettes. What do I end up doing? I just run to Google. I'm sorry I'm not smart enough for you, Photoshop, but if I have to resort to a frantic Google search every time I want to get something complicated done, there's probably something wrong with at least one of us.

Solution: Spotlight




This is another case where I believe the solution has already arrived -- it's just not evenly distributed. Apple pushed a "Spotlight" search bar into System Preferences a little while back, and Microsoft has similar search capabilities in its Control Panel. I think those are both cases where a "simpler" clickable interface just wouldn't help much -- my mouse is powerless when faced with such a sprawling tree of controls and functions. In both cases, search also does the job of training me where things are, so the next time I can find them faster.

The big upside for me is that I love using my keyboard, and this gives my keyboard way more to do outside of text entry and painful key commands. While giving computers voice commands has never caught on, for a multitude of reasons, it makes sense that I could tell my computer what I want to do using natural language in text form -- it worked so well for Matthew Broderick in WarGames! I'm sure there are plenty of other things we could learn from the command line interface that could make sense when implemented in a natural language, intuitive sort of way. This could also solve some of the abundant clicking of window management: instead of dragging and dropping continually, I just "tell" my computer which two tasks I want to focus on right now.
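
As a toy illustration of that keyboard-first discovery, searching an app's command tree can be as simple as ranking menu paths by the words you type. The Photoshop-style menu paths below are illustrative, not the app's real structure:

    # Invented command palette over a flattened menu tree.
    COMMANDS = [
        "Image > Adjustments > Desaturate",
        "Layer > Flatten Image",
        "Filter > Blur > Gaussian Blur",
        "Select > Color Range...",
    ]

    def search(query):
        """Rank commands by how many of the typed words they contain."""
        words = query.lower().split()
        scored = []
        for cmd in COMMANDS:
            hits = sum(word in cmd.lower() for word in words)
            if hits:
                scored.append((hits, cmd))
        return [cmd for hits, cmd in sorted(scored, reverse=True)]

    # Typing what you want beats hunting through the menus:
    print(search("blur"))           # ['Filter > Blur > Gaussian Blur']
    print(search("flatten layer"))  # ['Layer > Flatten Image']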


9. Independence from mobile phones

Problem: I'm sitting at an all-powerful laptop, and yet I have to fish a tiny little computer out of my pocket to answer this phone call.

I don't think this is complicated at all, but for whatever reason it's 2010 and I'm still routing most of my calls over a highly unreliable network of cell towers. Sure, phones are great, I love having a cellphone and can't even conceive of my life without it, but when I spend 12 hours a day in front of a computer that could handle my phone's tasks so much more effectively, it's almost a little silly to be holding up this hot slab of phone to my face, wondering if I'm going to drop a call.

Solution: Nokia, T-Mobile, common sense




Nokia has already done a lot of this work with its wide assortment of software (which it's recently started to consolidate under the Nokia Ovi Suite moniker), and we've seen T-Mobile passing cell calls to VoIP with HotSpot @Home for ages, but there's a significant lack of integration and comprehensiveness to these services. Being able to view and respond to text messages from the desktop, as Nokia's software allows, is something I'd like to be able to take for granted -- particularly when some of the biggest phone OS makers (Apple, Microsoft, Google) are intimately acquainted with my desktop.

In a perfect world, if my iPhone is plugged into my computer and I've toggled the related preference, a new SMS message would pop up in my iChat (or that fancy new notification tray I've been dreaming about). I could respond right from my computer, or if that person happens to be on IM I could even push the conversation over to AIM or Gtalk. Doing the same thing for voice calls would be more complicated from a back-end point of view, but just as simple in theory for the consumer. I should be alerted to who is calling me right on my computer screen, and choose to pick the call up in any number of ways: on the handset, through a Bluetooth headset, or bumped over to VoIP and routed through my broadband connection. If on-television alerts for phone calls are something the cable companies can figure out, I'd think it's the least Apple and Microsoft could do on their desktops, if only to save some shred of geek dignity.
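
The routing decision itself is almost embarrassingly simple, which is what makes its absence so frustrating. A toy sketch, with the event formats and preference names invented for illustration:

    def route_call(caller, prefs):
        """Pick where an incoming call should be answered."""
        if prefs.get("voip_enabled") and prefs.get("broadband_ok"):
            return f"Answer {caller} over VoIP via broadband"
        if prefs.get("bt_headset_paired"):
            return f"Answer {caller} on the Bluetooth headset"
        return f"Answer {caller} on the handset"

    def route_sms(sender, body, buddy_on_im):
        """Decide how an incoming SMS shows up on the desktop."""
        if buddy_on_im:
            return f"Show '{body}' from {sender}; offer to move the chat to IM"
        return f"Show '{body}' from {sender} with an inline reply box"

    print(route_call("Mom", {"voip_enabled": True, "broadband_ok": True}))
    print(route_sms("Josh", "running late", buddy_on_im=False))

The hard part is the plumbing between the handset and the OS, not the logic on top of it.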

Sure, most of these things can already be done in some way or another with the right know-how or hackery, but I don't want to spend an afternoon figuring it out and finagling it into my OS; I want it to Just Work like this out of the box.


10. Lack of purpose and excitement

Problem: What has my desktop OS done for me lately?

Somewhere in the last decade, desktop OS builders decided that they'd attained some sort of "good enough" plane. After that point, they merely needed to tweak and add on, but true revamps became rarer and fresh ideas were always wary of trampling on a proven usage mechanism. They became safe and usable and stable (and I love them for it), but they also became boring.

It's not like I don't want the spit and polish of a truly completed OS, and I do greatly appreciate the efforts on the part of Ubuntu to support more hardware, Microsoft to slim down its kernel in Windows 7 and Apple to build Grand Central Dispatch for Snow Leopard. I'm also certain that after releases more along the lines of "maintenance" in Windows 7 and Snow Leopard, Apple and Microsoft are working on big things for their next versions. Still, there's none of the excitement, pace or innovation on the desktop akin to what we're seeing on phones right now, and it's not like there's a lack of new technology or market demand to hold back innovation.

Solution: Try harder

A desktop OS is exponentially more complex than a phone OS, but that doesn't mean I'm happy with waiting a few years for each major update. In fact, with the powerful chips, large size and multitude of input methods at a "real" computer's disposal, I think there's actually more room for new thinking about usability. Phones have been benefiting from their limitations by the mere fact that software designers have to trim the fat and develop for a very simple (often one-handed) UI paradigm. They have to try harder, just to make a phone that wants to be a computer truly usable. Since I can already accomplish almost anything on my laptop with a keyboard and mouse, and I have 8GB of RAM for swallowing up as much wasteful code as you want to throw at me, there's no desperate need for functionality forcing anyone's hand. I can't force their hand with my wallet either, because I'm still buying their products and getting things done with them, but hopefully someone deep within Cupertino, Redmond or their mom's basement is hard at work on something to take my desktop experience into the next generation of UI.


Wrap-up

What actually got me started on all of this is an editorial by John Gruber talking about what he thinks "The Tablet" project might mean from Apple: basically, a new sort of consumer "computer," a second coming of the Macintosh that rethinks what a personal computer should be, not a mere web-surfing-in-bed slate to keep us occupied for 10 minutes every night. My hope would be that Apple doesn't do away with interfaces like the QWERTY keyboard that have served us so well for so many years, but does present an OS that allows people to utilize their computers in new and exciting ways using touch and other interface innovations, while still accomplishing familiar tasks like updating Facebook, sorting photos and editing movies.

If what we've seen of Microsoft's Courier is legit, it too presents an opportunity to use computers in a new way -- though a total rethink of the primary consumer OS obviously presents more danger and risk to Microsoft. It's in Apple's best interest to get everyone to buy a new machine with a new operating metaphor, while a world where people can get 95% of everything done without Windows (the other 5% involves editing Excel spreadsheets) makes Microsoft surprisingly less vital to the consumer.

Even if I'm pitching it as "evolutionary," this sort of pie-in-the-sky, redefined world is likely years away: Windows 7 and Snow Leopard just showed up, and they aren't likely to resign their position of dominance in the lives of productive, communicative people in the near term. But for the sake of my throbbing carpal tunnel, frazzled brain and fragmented UI expertise, I do hope we get there this decade.





