A very interesting conversation erupted today, beginning when a coworker sent a lengthy email stating his reasons for leaving Ubuntu 11.04’s new Unity desktop interface altogether and resorting instead to the good, old-fashioned Gnome 2 “Classic” session.
In it he makes some very valid points about functionality that’s different from what he was used to. This understandably affects his workflow, so instead of wrestling with a new interface, he chose to go back to the old one, hopefully only until Unity matures enough for him to customize it to his liking.
What was interesting was the number of responses it got, with everyone airing their “pet peeves” about Unity. The vast majority were changes in how Unity handles things that interfered with people’s workflows. That’s understandable: when you’ve become adept at working with a user interface, even a small change in its behavior disrupts things enough (and annoyingly enough) that you either go back to the old interface or start fiddling with the new one until you get things to an acceptable state.
Which is what struck me as curious about this thread: there were basically two camps, those who flat out abandoned Unity for the time being, and those who actually dug into how Unity behaves and integrates with the environment, and came up with ways to make it more comfortable for those used to the “old ways” of Gnome 2.x and its desktop interface.
Without taking anything away from the original poster, whose points were quite valid, a lot of responses suggested ways to solve about 80% of his complaints about Unity. Still, the fact that it took a team of experts to solve problems that a user (and another expert, at that) was experiencing is testament to how much more intuitive, easier to use and more customizable Unity could still become.
I finally upgraded to Ubuntu 11.04 and Unity this past weekend. Like many, I ran into some usability issues where the desktop wasn’t behaving the way I was used to. However, the way I use the system means I basically want the UI to stay out of my way, so the main change I had to make was setting the Unity dock to auto-hide, so that it only appears when I ask for it; the rest of the time it stays hidden. Everything else is admittedly different from what I’m used to, but that’s change for you. Was Unity changing things for change’s sake? Maybe so, but I think it’s change in the right direction. Even if it somewhat alienates experienced users (for whom workarounds exist that address nearly all their concerns), I think the true measure of Unity’s success is how it works for new users, and here are two examples.
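Before those examples, in case anyone wants the same stay-out-of-my-way setup, here’s a minimal sketch of the auto-hide tweak. It assumes the Compiz-based Unity of 11.04, where the launcher is managed by the Ubuntu Unity plugin and configured through CompizConfig Settings Manager (the menu names below are from memory and may differ slightly on your system):

    # CCSM isn't installed by default on Ubuntu 11.04
    sudo apt-get install compizconfig-settings-manager

    # Launch it, then go to:
    #   Ubuntu Unity Plugin -> Behaviour -> Hide Launcher -> Autohide
    ccsm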
Another coworker posted his experience of showing Ubuntu and Unity to a newbie, fresh-from-Windows user. The user’s comments were along the lines of “this looks nice”, “it’s easy to use” and “I’m keeping it”.
Also, even though some have complained about the app lens being hard to use (a complaint I’ve already seen twice), I’ve watched users realize “but hey, if it’s really that messy, you can use the search field to find what you need, right?”. So yes, end users are figuring this out, and it’s just a matter of polishing how things work. If anything, I think it’s great to move users away from the “the computer has only two buttons” mindset and get them using the keyboard a little more.
So yes indeed, I’m staying on Unity, and I’m looking forward to seeing it mature into a better desktop interface. As Mark Shuttleworth said, it’s a foundation on which the next generations of the Ubuntu user experience will be built. I’ll be thrilled to be along for the ride.
Finally, for a great write-up on why your desktop changed, and on why giving it a whirl and helping improve it (even just commenting on the stuff you find hard, unintuitive or just plain wrong) is better than swearing off these newfangled changes (without which, face it, you’d still be using fvwm and MIT Athena widgets), please drop by Federico Mena-Quintero’s activity log and read his wonderful, short article “Moving into your new Gnome 3 house”.
I remember a simpler time when all keyboards had the same layout (C-64, anyone?) and, if you wanted to type special characters, you had to resort to arcane command sequences, if they were possible at all.
My, how times have changed.
My first PC compatible had a Spanish keyboard, and you could quite simply tell the OS (MS-DOS) about your keyboard layout. For a while this worked pretty well. Then someone decided that Latin America was so different from Spain that we needed our very own keyboard layout, one that just moves stuff around needlessly, destroying many years of experience for those of us accustomed to the Spanish keyboard. I understand removing the ç, as it’s not used in Latin America, but why move everything else around?
So basically I stuck with the Spanish keyboard, which has worked well in all kinds of OSes, from MS-DOS to Windows, OS/2 and, yes, Linux.
Meanwhile, the Latin American layout was such a pariah that, at some point, its code (la) got taken over by the Latvian keyboard; after a system upgrade, your keyboard was suddenly in Latvian, and you had to select “latam” to get Latin America back.
Eventually I happened to get a laptop with a Canadian French keyboard. Luckily, this is not the dreaded French AZERTY layout, but basically an English layout with most symbol keys mapped very strangely. So if you just want to type the basic alphabet you’re OK, as you would be with an English keyboard, but things get weird when you need special characters or have to compose accents, cedillas and the like. This was so different from any other layout I’d used that I was basically freaking out. I could just ignore the red characters on my keyboard and use it as a plain English keyboard, but I routinely compose text in Spanish and in French, so how would I go about doing that?
And no, the age-old trick of memorizing ASCII codes for special characters doesn’t cut it: for one, it’s unreliable on Linux (especially in graphical mode), and for another, it’s just primitive! I used to chuckle at all the people I’ve seen through the years with a nice “cheat sheet” of ASCII codes for frequently-used accented characters glued to their desk, as opposed to taking 15 minutes to configure their keyboards to do this natively.
So anyway, what I came across while checking out the available keyboard maps under Linux, trying to figure out how to type stuff on the Canadian keyboard, was this wonder of wonders: the US International with AltGr dead keys layout.
Basically, it takes the right Alt key (labeled AltGr on my keyboard, a monstrosity I was already used to from the Latin American and Spanish keyboards) and uses it to “compose” characters via dead keys (a dead key behaves like an accent: you press the accent key, and the next letter you type comes out accented). In combination with ~, ", ' and `, this lets me type nearly all accented characters with relative ease.
Also, I can use AltGr+vowel to type acute-accented vowels (áéíóú), and AltGr+n for ñ.
Grave accents (è) and tilded letters (ã) can be composed with AltGr+accent (` for grave, ~ for tilde), followed by the letter you want to type.
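If you want to try this layout yourself, here’s a minimal sketch using standard X tooling; setxkbmap changes the layout for the current session, and on Debian-family systems the same values can be persisted in /etc/default/keyboard (your desktop’s keyboard preferences dialog will do the same thing):

    # Switch the current X session to US International with AltGr dead keys
    setxkbmap -layout us -variant altgr-intl

    # To make it permanent on Debian/Ubuntu, put the same values in
    # /etc/default/keyboard:
    #   XKBLAYOUT="us"
    #   XKBVARIANT="altgr-intl"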
What I like about Linux’s keyboard selection thingy is that you can see an actual map of each layout. Thus, even if my keyboard doesn’t have the characters printed on the keys, I can take a quick peek and see where the stuff I need might be.
Thus I can type things like ç or €, all with a minimum of fuss. More complicated stuff like ï, œ and ø is still just one AltGr+key away. All this while preserving a layout that’s familiar to everyone (English), and where most of the strange characters used while programming ({}][\|~) are much easier to reach than on the Spanish keyboard I was used to (which needs AltGr for all sorts of braces and pipes, making it very painful on my hands).
So there you have it: if you find yourself wrestling with choosing a good physical keyboard layout *and* making it work on your OS, stop pulling your hair out, get an English-layout keyboard and use US International with AltGr dead keys!
Last week, during the Ubuntu Developer Summit, head honcho Mark Shuttleworth said something to the effect of “iOS and Android have managed to succeed despite Microsoft’s monopoly, and we haven’t” (see the keynote here). Over the next few days I thought about it a bit, and here’s what resulted.
I think it’s not quite as clear-cut as “they have done it and we haven’t”. Microsoft’s monopoly is on the desktop, and it is there that Ubuntu is going directly against Microsoft and perhaps, yes, failing to capture a significant percentage of the market. And I won’t go into the whole “Linux is better than Windows” debate.
Rather, let me point out a key fact about Android’s and iOS’s success: they both succeeded in a market where Microsoft wasn’t a dominant player. Before Apple unleashed the iPhone on the world, the smartphone market was very fragmented, with Microsoft a relevant player (Windows Mobile) but nowhere near the dominance it has on the desktop. Nokia (Symbian) and RIM (BlackBerry OS) were two big players, but both have since been relegated: Nokia to irrelevance (plus the deal with Microsoft), and RIM to a frankly defensive posture, lacking a strategy and scrambling just to stop the exodus of users.
Now, even though Apple and Google are the two strongest players in the smartphone market, things are still very much in flux, and no platform can claim the stranglehold that Microsoft has on the desktop. So those companies are forced to innovate and stay on their toes. But the fact is that, even with a product clearly superior to previous offerings, any one of these companies would have had a hell of a time dethroning a hugely dominant player from the field.
Ubuntu’s challenge is greater, as it’s going head-on against Microsoft’s cash cow in a market with no real competition. The only other mainstream operating system with any success is Mac OS X, and Apple is content with whatever niche they’ve carved for themselves; it’s clear to anyone that the strides they’ve made in the past decade are due more to the halo effect of the iPod and iPhone than to OS X’s (admittedly great) merits. So yes, they have a superior product, but that still hasn’t propelled them beyond a 10% market share. While I’m at it: it’s easy to forget that the first versions of OS X were kludgy, slow and difficult to use, with a myriad of usability problems. It was the iPod and then the iPhone that propelled Apple from a fringe player into the powerhouse they are today. In the end, Apple seems to have concluded that promoting Mac OS X isn’t worth a big effort, and that the momentum from the iPod and iPhone is enough to keep OS X alive.
So what does Ubuntu need to succeed on the desktop? I have no special insight into this, but let’s just recognize that it’s not as simple as looking at, and imitating, Android’s and Apple’s successes because, as I’ve said, their playing field was a vastly different one. Would a “halo-effect device” help Ubuntu the way the iPhone helped Mac sales? Maybe. Or maybe all Ubuntu needs is endurance, as even hugely dominant players (Ford, IBM, WordPerfect, Netscape) can be surpassed under the right circumstances.
During the Ubuntu Developer Summit (UDS), held last week (May 9-13) in Budapest, Hungary, a very interesting program was discussed: the Ubuntu Friendly program. Its end product should be a way for people to find out whether a particular computer system can run Ubuntu properly (it’s friendly!). This has many possible uses, not the least of which is enabling people to choose a system on which Ubuntu has a good chance of working without much tinkering by the end user. This matters in a world where most people compare a preinstalled Windows system (where the manufacturer has done most of the dirty driver installation/enablement work) with a from-scratch Ubuntu installation, where Ubuntu is expected to pick up all the hardware and work adequately with it.
Because of this last scenario/requirement, in my opinion, installing Ubuntu is already a much cleaner/friendlier experience than installing Windows; on my laptop, a Samsung QX410, Ubuntu has a few glitches that require manual configuration (touchpad, hotkeys), but the system is immediately usable out of the box. The same can’t be said of Windows, where a plethora of device drivers is required before even the most basic devices work. However, since the system is purchased with this work already done, in users’ minds the Ubuntu experience is not as polished as the Windows one.
I digress. So the Ubuntu Friendly program seeks to award Ubuntu Friendly status to those systems the community finds work well with Ubuntu. It in a way replaces the Ubuntu Ready program, under which manufacturers were responsible for running the tests on their own systems. It also complements the existing Ubuntu Certified program, where hardware is tested in-house by Canonical under more stringent standards, and an official Canonical-sanctioned certificate is issued to machines that are deemed to work with Ubuntu out of the box.
Needless to say, there’s a lot of interest from the community in this Friendly program; it’s a win-win situation where the community can contribute valuable testing and results, and the world becomes a better place through what will be a large list of systems that the community has identified as being “Friendly” to Ubuntu.
During UDS, the sessions where this program was discussed were a great success; attendance was good, and I was glad to see people from outside Canonical’s Hardware Certification team participate, with plenty of interest and participation from community members too. There were a lot of valid concerns and good ideas on the table, and even though an extra session was scheduled for this program, time ran out with people still wanting to contribute.
All in all it’s a very interesting program, one that will hopefully take shape quite soon. If you’re interested in seeing what this is all about, here’s the blueprint with a more formal description of what’s being planned.