My basic Linux questions
GHoosdum
Icrontian
Since I've retired my desktop system from day-to-day use, I'm planning on redeploying it as a test bed for various systems and programs that I want to learn. One of these things is Linux.
Can you smart gents recommend a distro for me? I'm not afraid to learn something new, but I also want something with the potential to become a future standard in the Linux world, i.e. a real future challenger to Windows. I know there are a poop-ton of distros out there, but it may be as simple as a specific desktop environment becoming that challenger, rather than a specific distro.
Now, onto the other question: is it necessary that I run an nVidia card in order to use the GUIs, or is there some sort of reference driver that allows ATI cards to run without all the bells and whistles turned on?
School me up here!
Comments
Many people use Ubuntu, so a lot of the questions you may have may already be asked and answered by searching Google.
Give it a go.
ATi has a driver for Linux just like nVidia does, the only difference being that the ATi one is crap in comparison. So you don't need to change anything you do, and you won't lack all the bells and whistles; you just won't get as many FPS in Doom 3.
Ubuntu is nice, but it may be a little too... "bleh" for you. It's made for people who have never used Linux and aren't really up to using any of the other distros, heh (although I don't mean that everyone using it is a moron).
But yeah, the differences from distro to distro are: where certain base files are kept, how the base system is maintained, and what package system (if any) it uses to easily install popular software and its dependencies.
I can't really recommend a distro. If you choose to go with Gentoo then I can help you out, but I don't have enough experience with the others to say yea or nay. The advantage of Gentoo is that if you had installed it six years ago, it would still be the latest version and up to date today, because the entire system is modular: everything is constantly upgraded, nothing left out. I'm running it on my Athlon64 X2 system right now, and I'm also running it happily on my old Athlon 500MHz with a 4GB HD (far faster than Windows 2000 ran on it). That goes to show how you can tailor it to suit any need as well (although that's more Linux than Gentoo, but Gentoo is all about maximum control and customisability, so it makes that easier).
I don't understand why people say that; I can install Ubuntu and get it up and running within 30 minutes.
Trying to achieve the same task with any other distro takes longer and is more complicated (for me, that is). Does that make me dumb? Lazy? Ubuntu is as stable as any other Linux distro.
So, if it doesn't take long to install and is easy to install, that makes it... bad?
Note: please don't get me wrong, I'm not flaming you.
http://gentoo-wiki.com/HOWTO_X850XT_ATI_Drivers
Setting up ATI drivers in ubuntu dapper
https://help.ubuntu.com/community/BinaryDriverHowto/ATI#head-5ead174a0b3294527486cd4d71ded66b40003f25
I'm definitely taking it all in. Keep it coming.
It's more or less made for those who want to USE their desktop with the Linux OS. I really would not bother with any other distro until you try Ubuntu out; it would probably end in tears, or maybe just a format of the HDD.
What you might encounter on the interweb are a lot of Linux elitists, closet masochists who enjoy spending hours recompiling a kernel to add functionality obscura or tweak out meaningless performance gains on ancient hardware. Please ignore them.
Is it really far faster? Are there any benchmarks to show this, or is it a "feels faster" thing?
Oh, I agree with you; it just doesn't have the level of, erm, "control" that I like. It's probably overkill for most people, and as I said, Ubuntu doesn't "have" to be for non-technical people. I was just thinking that if Ghood is like me, then he'd prefer to have more... tweaking ability.
Don't need a walkthrough for that.
"emerge ati-drivers", then change the graphics driver option in /etc/X11/xorg.conf to fglrx instead of vga.
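For the record, that edit can be scripted. This is a sketch against a scratch copy of the file, since the real one lives at /etc/X11/xorg.conf (back it up first); the Device section contents here are illustrative:

```shell
# Make a scratch Device section like the one X generates (illustrative values):
printf 'Section "Device"\n    Identifier "card0"\n    Driver "vga"\nEndSection\n' > xorg.conf.test

# Swap the generic driver for ATI's binary fglrx driver:
sed -i 's/Driver[[:space:]]*"vga"/Driver "fglrx"/' xorg.conf.test

# Show the updated line (now reads: Driver "fglrx"):
grep 'Driver' xorg.conf.test
```

On a real system you would run the sed command against /etc/X11/xorg.conf itself and then restart X.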
Screw you too, "buddy". There's no such thing as a "powerful" distro either; the terminology doesn't make any sense in that context. And I do "USE" my desktop with the Linux OS, kthx.
Considerably. Don't get me wrong, running Windows 2000 on THIS machine wouldn't be sluggish, but on that one I'm using a window manager that has no dependencies and is 600k in total, so it starts up within a second or two, and MPlayer only takes about 2-3 seconds versus the 6-8 seconds VLC or Media Player Classic took on 2000 (on that machine). Programs on it just seem to have a reduced load time (maybe the programs are just smaller? Although optimisations do have more of an effect when they are really needed, as in this case). Windows always seemed to churn the HD more (it couldn't have been the RAM, though, as the machine has nearly 400MB). The advantage is that with Gentoo I can cut off bits wherever I want on that machine, whereas Windows is (except for non-essential services) set in its ways. On a normal machine that wouldn't be much of an issue, but on a machine like this, stripping is good.
At this point I'd have some trouble finding a truly bad distro. I seem to get the most tech-support requests about Fedora, but that could just be PEBKAC. I'm going to break down my impressions of a few distros and you can decide for yourself.
CentOS and Fedora:
These are both very similar in that they use yum as their package manager, which uses .rpm packages. The pros of .rpm are that it is the most widely available form of binary package and that, as a binary, you don't have to mess with maintaining a working toolchain (gcc and friends). The cons are that .rpm files you find on the web are almost guaranteed to toast your install ("RPM hell") and that the binaries are optimized to run on the widest variety of hardware. Try not to install packages that aren't part of yum's carefully-manicured package dependency tree. Each distro using yum maintains its own package tree, which generally carries a larger quantity of non-free software than distros not using yum.
Fedora likes to take care of you and, by polling you during install, will set up a whole lot of software that you may not know or even need. For instance, if you say you want "Games" it will install several common GNU games, "Office PC" nets you office suites like Evolution or OpenOffice.org, and so on. When this software breaks or needs updating you will find yourself making more threads in my forum. I imagine CentOS does much the same thing but is more geared towards servers. As far as I'm aware, the packages get installed with a default configuration, and you will have to seek out and modify the config files yourself if you want to deviate.
Debian, Ubuntu, and kUbuntu:
All of these use Debian's package managers, apt and dselect, which use .deb packages. .deb files are less common than .rpms on the web but are a lot less likely to trash your system. Install-time package selection in Debian is done through dselect, which is an absolute turd of a program. If you go with Debian, install only the "minimal" set and add more packages after you restart using apt-get. I believe Ubuntu and kUbuntu have their own installer, which doesn't use dselect. Contrasting strongly with dselect are apt and the more-often-used, vastly more user- and system-friendly apt-get. Apt-get maintains a fairly comprehensive package tree, including some non-free (read: not GPL) software, that can be installed and its dependencies resolved with a simple command. Ubuntu and kUbuntu use most of the same packages as Debian's tree, with a few exceptions. The Debian package tree is smaller than yum's, but you probably won't notice. One of the nice things about .deb is that it includes source packages in addition to binary ones, so you can get some of the benefits of building from source. The downside of .deb is that configuration has to be done manually in dselect and isn't done automagically by apt-get.
All three of these offer to take care of you in much the same way as Fedora and CentOS, but they also allow individual package selection. I wouldn't do it under Debian because it will install old versions of the software off the install CDs, with obsolete dependency trees, all of which will need to be updated when you get Internet access. I could be wrong, but I believe Ubuntu and kUbuntu fetch packages off the Internet during install, so it isn't such a problem.
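The day-to-day apt workflow described above looks roughly like this. The commands need root and a network connection, so they are shown as a commented reference rather than something runnable here, and the package name is just an example:

```shell
# Typical apt session on Debian/Ubuntu (run as root):
# apt-get update                 # refresh the package tree from the mirrors
# apt-cache search mplayer       # keyword search over the package tree
# apt-get install mplayer        # install it, resolving dependencies automatically
# apt-get upgrade                # pull in newer versions of installed packages
```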
Gentoo:
Gentoo uses the portage package manager, which is inspired by BSD's ports system. Portage uses .ebuilds to tell the machine about packages; an actual .ebuild is just a text file containing the package's dependency tree, a set of compiler flags, and the filename(s) of any files off the Internet needed to install the package. Portage maintains a list of all the most current ebuilds on your hard drive and figures out dependencies on the fly whenever this cache is updated. Installing software with portage is as easy as using apt-get, and most of the configuration is done automagically, though portage tells you when you need to update a file manually.
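To make "an .ebuild is just a text file" concrete, here is a toy sketch of one. Every name in it (foo, the URLs, the dependency) is made up, and real ebuilds follow the format in the Gentoo documentation, so treat this as illustrative only:

```shell
# foo-1.0.ebuild -- hypothetical package, illustrative only
DESCRIPTION="A made-up example package"
HOMEPAGE="http://example.org/foo"
SRC_URI="http://example.org/${P}.tar.gz"   # file(s) portage fetches off the Internet
LICENSE="GPL-2"
SLOT="0"
KEYWORDS="x86 amd64"
DEPEND="sys-libs/zlib"                     # the package's dependency tree

src_compile() {
    econf || die "configure failed"
    emake || die "make failed"
}

src_install() {
    emake DESTDIR="${D}" install || die "install failed"
}
```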
Most software for Gentoo is source-only. This means that Gentoo has very little non-free software in its package tree. This doesn't mean that you can't use that software on Gentoo, merely that you can't use portage to maintain it. The advantages of using source packages have been stated already, but I'll mention them again for redundancy: packages compiled from source on a properly-configured compiler are optimized for the hardware configuration of the target machine. On my machines this usually equates to using SSE instructions instead of 387 instructions for floating-point calculations on all machines that support SSE, O2 optimizations, and turning off compatibility modes for processors I don't plan to use. I don't have a way to benchmark how much faster SSE is than 387, but you can use your imagination. Unfortunately, compiling packages takes some time, so initial installation will take a while.
Portage uses USE flags to tell it which features should be enabled or disabled in software. Portage maintains a list of globally-defined USE flags that generally shape your system, but they can also be defined individually for each package. This is useful if, say, you wanted PGP support in Thunderbird: first you would set the enigmail USE flag for Thunderbird and tell portage to install it. Portage would see the enigmail flag, install enigmail together with Thunderbird, and configure everything for you. It's also useful if you want to enable some application-specific optimizations like 3DNow! and SSE acceleration in MPlayer, or specify one library over another, like using FFmpeg instead of Xine or GStreamer in MPlayer.
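The Thunderbird example above would look something like this on disk. The package and flag names are taken from the post and may not match a current portage tree, and the sketch writes to a local scratch file so it is harmless to run; on a real Gentoo box the file is /etc/portage/package.use:

```shell
# Per-package USE flags, one package per line (real path: /etc/portage/package.use)
PKG_USE=./package.use
echo "mail-client/mozilla-thunderbird enigmail" >> "$PKG_USE"   # PGP support
echo "media-video/mplayer 3dnow sse"            >> "$PKG_USE"   # app-specific optimizations
cat "$PKG_USE"
# Then preview the effect with: emerge --pretend --verbose mozilla-thunderbird
```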
Until recently, Gentoo did not have an installer but left it up to you to get it installed from a LiveCD. There is a very well-documented procedure for this in the Gentoo Installation Handbook but you need another machine with a working Internet connection to read it while you install, a printed copy, or enough Linux savvy to fire it up on a virtual terminal on the target machine. LiveCD installs are more difficult than using an installer but in my opinion it isn't so bad when the manual is so good.
Gentoo now includes an installer that takes care of most of the hard parts for you but less of what's going on is explained. Gentoo has the most extensive collection of HOWTOs for getting software installed and configured and strange hardware to work. Their forums are also the nicest and most helpful of any place I've ever gone to get help.
I hope this helps.
-drasnor
Yeah, I've met them. However, if you use Gentoo you'll want to compile your own kernel, which isn't as hard as people make it out to be. It can be weird for other distros, though, especially those that use ramdisks to boot. It usually only takes me ten minutes to configure a kernel, but the first time should only take 30-45 minutes. All you need to know is what's in your computer and what you want to do. If you don't mind long startups, the generic kernels are fine for most people.
-drasnor
Consider the case of the Linux router. It takes a very different kernel than a desktop PC. A typical desktop kernel would only have IPv4, IPv6, and iptables/netfilter (the Linux firewall) support compiled in or compiled as modules. My router kernels usually feature multicast routing, QoS routing, IPsec VPN, and full NAT masquerading. Most of those features aren't present in the generic kernels. There are other examples too, like SELinux, but I don't use it so I can't comment.
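Those router features map onto kernel config symbols roughly as follows. The symbol names are from 2.6-era Kconfig and drift between kernel versions, so treat this as a sketch rather than a drop-in .config:

```shell
# Fragment of a hypothetical router .config:
CONFIG_IP_MULTICAST=y       # multicast routing
CONFIG_IP_MROUTE=y
CONFIG_NETFILTER=y          # iptables/netfilter firewall
CONFIG_IP_NF_IPTABLES=m
CONFIG_IP_NF_NAT=m          # full NAT masquerading
CONFIG_NET_SCHED=y          # QoS routing disciplines
CONFIG_INET_AH=m            # IPsec VPN transforms
CONFIG_INET_ESP=m
```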
-drasnor
OK, so when recompiling a kernel to stitch together the functions that will best serve a specific user's needs, how is this done? Coding? Command strings? Both?
Config is the nastiest and second-most difficult way to configure a kernel. It asks you questions and configures each option sequentially, without any sort of help.
jormungand linux # make config
HOSTLD scripts/kconfig/conf
scripts/kconfig/conf arch/x86_64/Kconfig
#
# using defaults found in .config
#
*
* Linux Kernel Configuration
*
*
* Code maturity level options
*
Prompt for development and/or incomplete code/drivers (EXPERIMENTAL) [Y/n/?]
Menuconfig is my favorite way to do it, since all you need is a working libncurses install. It's run from the command line, but it fires up a DOS-shell-esque, menu-oriented configuration tool. There's some help available for the various options.
Xconfig/gconfig is another way to do it, using a GUI. I find that menuconfig is faster since I don't have to go back and forth with the mouse. The same help available in menuconfig is also in xconfig/gconfig. You would use xconfig if you have the Qt libraries installed (e.g. you use KDE) and gconfig if you have the GTK libraries (e.g. you use GNOME).
The nastiest and hardest way to configure your kernel is to fire up nano, vi, emacs, pico, or whatever you use for console text editing and open .config in the kernel source directory. There you can set each option by hand (=y for built-in, =m for module, or commented out as "not set").
-drasnor
Not true. Running the ATi config program just does what I said and then breaks your xorg.conf file by re-writing all the default values (I mean everything: the list of modules loaded, the locations of directories, etc.). STAY AWAY FROM IT. The only thing it changes that needs changing is the one I mentioned. It's basically a hacked version of xorgconfig.
-drasnor
There is a .example one there by default. The one ATi creates won't work, as it points the directories to the wrong place and everything else. So, you could do what I say, change one value in the real example file and have it work, OR use ATi's config program and break your X installation. Pretty clear choice, isn't it? And yes, I'm speaking from experience, as my laptop is ATi.
-drasnor
It may be an issue with Xorg versions then, as they change the directories for different files in different releases, whereas the ATi config program usually seems a step or two behind.
i've always been a big fan of debian. apt is an awesome tool. i hate rpm-based distros... at some point you'll get stuck in dependency hell trying to update some package you want. it can be a pain. ubuntu is based on debian and is easy to install, but debian also has a new installer that is very easy. http://www.us.debian.org/CD/netinst/ for the iso images, or check out the wiki for more info:
http://wiki.debian.org/DebianInstaller
i used the new DI to build a custom install cd for specific dell servers with a particular raid device... it's ultra handy and the machines are rock solid. still in use as http and mysql servers, been up and running for 2.5 years now (with a little break every now and then for patches/updates and making sure there's not a lot of dust in there).
plus my desktop is an amd64 running debian, has been since the day i got it. i run two ati cards, but they aren't anything spectacular. all i do with it is develop code, and check my email. and code. and code code code code...
I always keyword the untested graphics card drivers because it's very rarely that the latest version breaks my system.
-drasnor
as far as the DI goes, it looks like m68k is one of the supported architectures. so is amd64, but it still isn't final for some reason... but it does work, heck i used it in december '04 to install with. it's listed under the daily builds.
hey the i386 and amd64 netinst cds are graphical installers!
yeah, dselect was just about as awful as it got. i used it once and thought, why on earth am i doing this... apt rendered it pretty much useless. i'd install the base system and then use apt-get to add whatever else i wanted, and it took care of dependencies. i haven't thought about dselect in a while... it's amazing to see how far linux in general has come.
If you are new to Linux and would like to start learning, I would recommend nothing other than Ubuntu. Yes, Ubuntu is based on Debian, which is great because you can use all the Debian packages. Another thing that makes Ubuntu so great is that stuff just works out of the box, and amazingly well at that. As you dive deeper into Linux you will find it is very customizable, so with the first-off simplicity you don't really sacrifice anything. Lastly, I recommend Ubuntu for the community: the Ubuntu forums are the most helpful Linux community I have participated in. No matter what your problem or question is, you will find an answer on their forums. (Plus the unofficial ubuntuguide.org is great.)
If you want to start with something a little less intimidating, I strongly recommend Ubuntu. The newest release (Dapper Drake 6.06) has some really nifty eye candy as well, and your friends will be like :o~~~ haha.
If you are on the fence about using it, download the live CD and try it out before you do anything drastic. I promise you will be impressed.
Also, for the time being I would recommend sticking with the i386 install, because many apps and packages are harder to install (or worse, don't install at all) under the amd64 install... it requires more 'hacking'.
anyways if you have any questions please ask!
cheers
as far as the amd64 stuff goes, we can help him set up a chroot 32-bit environment to run those x86 apps, right?? lol
Anyways, since I'm on satellite broadband and subject to the FAP (fair access policy), which limits how much data I can download over a 3-hour period, downloading ISOs is pretty much out of the question. So someone told me that Ubuntu will send you free disks of their distro, and I went and ordered a 3-disk set today, containing a 32-bit disc, a 64-bit disc, and a Mac disc. And they send it for free; they don't even charge shipping.
So anyways, in 4-6 weeks I should have the disks in hand, and I will be trying out Ubuntu on my X2 folding rig. But I don't know if I should go with the 32-bit or 64-bit version. Since its main focus is folding, I'm not sure it would gain any speed from the 64-bit version. What about it, you experts?