The "You've Got To Be Kidding Me" Moment
OK, here are the redeeming facts as near as I can ascertain them:
You can probably get real use out of a Linux system before you ever need to touch the command line.
There is a program in the graphical user interface, which you can get to from the System menu, called Synaptic Package Manager. Search among thousands of quality programs and just tell your powerful servant to get them for you. It will cover for your inadequacies without you ever having to sacrifice a chicken to it.
There is an Ubuntu starter guide. Unfortunately, since graphical user interfaces change so often, it gives its instructions only for the command line. Still, it will walk you through getting Menu Editor, Java, Flash Player, a PDF reader, and so on.
Linux is able to associate many file types with a program to automatically open them.
You can right-click a document and choose from a menu of programs to open it with, just like in Windows.
You can, if you wish, set one of the preferences to auto-launch software from CD when it's inserted.
Installing a program is supposed to put it in the Applications menu. This sometimes works.
If you can hunt down the file in your files browser, you can use a menu editor to make it go in the Applications menu.
Right-click the taskbar and you'll be given an option to put something from the Applications menu there as a launcher button.
The problem with Linux adoption by newbies might not actually be Linux. It is surprisingly forgiving. You just wouldn't know that from the help you get. The GUIs change, so the people helping you have no way to know how to play in the shallow end of the swimming pool. They grew up using the command line so that's what you get. So the non-Linux user will inevitably have a "you've got to be kidding me" moment. They will not tolerate very many of those, and rightly so. The upshot of this is that I am going to turn in a feature request in Bugzilla. The feature would be for the install process to end in asking you if you want a little guided tour of the basic, fundamental computing tasks.
For instance, one of the first things the newly installed Ubuntu should describe to someone who requests the tour should be Synaptic Package Manager. Not apt-get. Wait until they are happily using Synaptic to download and install programs, and then tell them how much cooler it is to use apt-get from the command line. But have paper towels on hand to clean the Pepsi that they snort from their nose laughing at you.
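For anyone curious what the command-line route actually looks like, here is a sketch of the apt session that Synaptic wraps (the package name is just an illustrative example, and these commands need root privileges and a network connection, so don't paste them blindly):

```shell
# Refresh the list of packages available from the configured repositories
sudo apt-get update

# Search the package descriptions by keyword
apt-cache search "pdf viewer"

# Download and install a package; xpdf is only an example name here
sudo apt-get install xpdf
```

Synaptic is doing exactly this behind its point-and-click interface, which is why the two stay in sync: anything installed one way shows up in the other.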
Comments
stormgren on Jul. 5, 2005 3:25 PM
If it wasn't for the power failure last night, I would have tried to point you in this direction, but I'm happy to see that you've caught it on your own.
Excellent.
A little FYI tidbit: Personally, I hate apt. On my GUI-based systems, however, I use Synaptic. Why? Because I can get the software I want installed much faster, because it's a lot more user- and administration-friendly. On my servers that have no GUI, I'd rather use aptitude or dselect (terminal-graphics driven package managers) to install something. I only use apt and apt-get when I'm installing a .deb file that's not part of the mainstream Ubuntu or Debian packages.
ObOtherFYI: Assuming you haven't already done this, if you want to see everything that Ubuntu can install through Synaptic, go to the "Settings" pull-down menu, then click "Repositories". From there, click "Add" and, for each selection in the combo box (such as "Ubuntu 5.04 'Hoary Hedgehog'"), check all the selections and hit OK. NOTE: Do each repository selection one at a time by selecting one, clicking the check boxes, and clicking OK. If you were to select each item in the combo box, click the checkboxes, and click OK, the settings would only take effect for the last combo-box item you selected.
This is somewhat counterintuitive for how I think the menu should behave, so they'll be getting a bug report from me soon, but I digress.
When you're done doing that, click the "OK" button in the "Software Sources" window. At this point, make sure you're connected to the Internet and hit the "Reload" button. It may offer to reload the package information when you hit OK at the step above, but I've seen cases where it didn't.
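For reference, checking every component in that dialog amounts to lines like the following in /etc/apt/sources.list, which is the file Synaptic edits behind the scenes (a sketch for Hoary; your mirror hostnames may differ):

```
deb http://archive.ubuntu.com/ubuntu hoary main restricted universe multiverse
deb http://security.ubuntu.com/ubuntu hoary-security main restricted universe multiverse
```

Each line names a repository, a release, and the components (main, restricted, universe, multiverse) to pull from it.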
It will then refresh and download information about all the packages it knows how to install. Some of these, especially ones marked without the little Ubuntu logo in the Synaptic window, may not add themselves to the GNOME/Applications menus, but will install. Looking at the package's information within Synaptic, or at the package's homepage, will tell you the directory it has been installed to. You could then run it by putting the full path (/all/the/directories/plus/application-name) in the "Run Application" box, and then maybe adding an icon for it later.
Hope this helps a bit more.
phecda on Jul. 5, 2005 8:15 PM
When you are building something for the first time, you usually have to make a decision as to whether you are building something to be pretty or to be functional. In rare cases you can accomplish both. The path that Linux has taken is one of functionality first. Our friends at Microsoft took the view of "put out anything and we'll make it work later." After doing this for 15 years, they are finally addressing some of the fundamental function issues that have plagued the Windows environment. Keep in mind that DOS was only supposed to be a stepping stone to implementing Unix (Xenix, to be more exact) on the Intel platform. Apple managed to master both functionality and form pretty much from the get-go, but this problem is much easier to solve when you control the hardware, too.
Matt, from your standpoint, your experience with computing is from the user interface. Your desktop is a tool that you interact with to create documents and images and to surf the web. But, to take image processing as an example, would you really expect the same warm fuzzy experience on your desktop to be present on a render farm? Would you prefer the major performance hit that would be required to support a complex GUI to getting your images processed faster? CLI has its place. The interface is much simpler (albeit one that requires more expertise from the user), more responsive, and more flexible.

Depending on the complexity of the shell (and I'm talking real operating systems like Unix, VMS, and MVS) you can craft complex batch processing scripts, easily customizing your computing experience. You don't like the finality of rm? Alias it to rm -i, and you will always be asked to confirm deletions. I agree that there should be a shallow and a deep end. Windows and MacOS have not traditionally provided much of a deep end — more like a kiddie pool — while the deep end on Unix, VMS, and MVS more closely approximates the Mariana Trench. Please understand, when talking with most Linux geeks, that we prefer the deeps and are not comfortable in the shallows. The CLI has stayed roughly the same since the early '70s. We know it; we're happy with it. You get a new GUI and you're constantly fiddling with it to find things. Whenever there's a new version of Windows, I spend literally months trying to track down where they've hidden all the administrative functions. The lack of consistency is irritating at the very least.
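To make the rm safety net concrete, it's a one-liner in bash (a sketch; put it in your ~/.bashrc so it applies to every interactive shell):

```shell
# Make rm ask before every deletion by wrapping it in its interactive flag
alias rm='rm -i'
# When you really mean it, prefix a backslash (\rm file) to bypass the alias
# for a single command.
```

This kind of per-user customization is exactly the "deep end" flexibility being described: the shell itself is programmable, so you reshape the tools rather than waiting for a vendor to add a preference checkbox.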
But here's a pointer to a discussion on Slashdot on the future of the Linux GUI. This certainly is the next frontier of Linux development, given that Linux now has the power and functionality of Solaris (minus the Veritas bits, but you can buy those for Linux, just like you have to buy VxFS for Solaris) or HP-UX. I'll probably get some flames for saying that I think AIX is better than Solaris, but the fact that you don't have to buy a third-party file system to have a functional OS is a big issue, and the volume management (included in AIX, from MVS) is vastly superior to Solaris or HP-UX.
None of these features matter at all to the typical desktop user, but for the enterprise user, disk management, network communications, and overall processing capability are far more critical than a pretty GUI.
matt-arnold on Jul. 6, 2005 2:24 AM
Whether CLI is indispensable to other people is irrelevant to whether it's an evil thing to inflict on me. I said before, and I'll say again, that CLI has its place. That place is far away from novices, but it is still a place. I'm not an enterprise user and I don't have a render farm. Those users can go ahead and use some version of Linux that has not been tailored for the new, non-technical desktop user. But there should be an adoption build. Not one that locks out the potential to use CLI in the future for its power-user features, but one that heads off a potential disaster by saying "whoa there, new user, don't ask that Linux geek for help about this, he'll just scare you. Take this tutorial, written in human-speak, about how to avoid having to talk in computer language."
In the past few journal entries I have not been complaining to power users that they have too powerful an OS. I've been complaining to TUX Magazine, whose raison d'être is that Linux is supposedly usable by newbies from the world of the desktop. They're not the only ones declaring it ready for widespread adoption, either. Nary a word is spoken in any of their issues thus far to clue us in to the reality we will actually experience.
phecda on Jul. 6, 2005 8:51 PM
From a desktop usability standpoint, Linux is about where Windows 95/98 was when it first came out. Many things are working, but there's still a slightly kludgy feel to it. Having been subjected to the whole evolution of Windows (including Windows 2.0 running on a 286) I am ecstatic with where Linux is now. So keeping in mind that distros like SuSE and Ubuntu are where windows was six years ago (from a GUI standpoint, not from a functionality/stability/reliability standpoint) and given that Microsoft was pushing that self-same interface as being for everyone, I don't really see the issue with TUX magazine saying that the Linux desktop is ready for primetime.
Six years ago, when someone would come to me for a recommendation on a system to purchase, I would evaluate what they wanted to do and their technical capabilities. For less technically savvy people I would suggest a Macintosh, because everything worked well, was nicely integrated, and made a very nice computing appliance. They were able to get their work done, and I was left in peace.
Just like some people like to vacation at resorts where every whim is catered to, and some people like to go out into the middle of nowhere for the challenge, it really comes down to what you want as a user. I prefer not to bow to the Microsoft computing hegemony, and as much as I've been an Apple fan over the last (almost) thirty years, Steve Jobs irritates me. So free software in all its manifestations is my preference, and I'm willing to put up with a less integrated environment. It also helps that I grew up with Unix and it's a comfortable environment for me. GUIs, while pretty, are very inefficient when you want to accomplish real work. Key combos, though they have a steep learning curve, are far more efficient than mousing through menus trying to find some obscure feature or do repetitious actions. Typing 200dd in vi is much more efficient than using the mouse to select 200 lines in Word and then choosing "delete" from the Edit menu. At the same time, I recognize there was a significant learning curve for me to learn all the key commands in vi, and someone who is not facile in text editors is going to be more productive faster in a GUI editor. And these days we're all about instant gratification.