Discussion
  • Dylaner → Whitson Gordon
    7/01/11 6:22pm

    This is all excellent advice for someone. TheFu, you know your stuff! I like it :)

    (And this is bookmarked).

    However, I think it goes way further than this question is asking, mostly because it's more of a general guide for people who are cozy with Linux already. Also, it tends to bother me a little when an article aimed at someone new to Ubuntu presents the CLI way of doing things as if that is the _only_ way. Personally I love the command line, but really, this stuff doesn't need it. If it did, we would all be doomed. Allow me to provide an Ubuntu-specific GUI translation for fellow simple folk :)

    Updates:

    Your Ubuntu system will check for updates routinely and the update manager will pop up on its own. (Technically this happens every two weeks if you have updates available, or when you have an important update waiting). If you want to check on your own, the program is called Update Manager and you can get to it by searching for it or in the System›Administration menu (depending on which version of Ubuntu you have).
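
    If you're curious how often that automatic check runs, the schedule lives in a small text file; on the Ubuntu installs I've seen it's /etc/apt/apt.conf.d/10periodic, and you can peek at it without changing anything:

    $ cat /etc/apt/apt.conf.d/10periodic    # a "1" next to Update-Package-Lists means the lists are refreshed daily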

    Backups:

    I would like to add Deja Dup here. Two reasons I recommend it: first, it's easy to use (of course). Second, it will be installed by default in Ubuntu 11.10, so it will be very well supported in the future.

    You won't need to worry about system files unless you start editing them, naturally.

    You can install Deja Dup from the Software Centre, and you can configure it to run regular backups.
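
    (Under the hood, Deja Dup is a friendly front end for a command line tool called duplicity, so if you ever want to script a backup yourself, something roughly like this works; the paths here are just examples:)

    $ duplicity ~/Documents file:///media/backup/documents            # back up Documents to an external drive
    $ duplicity restore file:///media/backup/documents ~/Documents    # and bring it back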

    Clean Up Temporary Files:

    You can also use Go › Search for Files in the file manager and point it at your home folder (~). The find command, naturally, is _excellent_ and if you're interested in command line stuff that's a nice place to start!
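
    To give you a taste, here are a couple of harmless starting points (these only list files, they don't delete anything):

    $ find ~ -name '*~' -o -name '*.tmp'    # editor backups and obvious temp files
    $ find ~/.cache -type f -atime +90      # cache files nothing has touched in 90+ days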

    Removing software:

    If you're installing stuff through Software Centre or Synaptic, you can remove stuff from there as well. A nice thing about Ubuntu is that those two and apt-get all plug into the same central package management system, so they all have the same picture of what is installed and how to remove it.
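
    If you ever do want to do it from a terminal, it's the same system underneath; "packagename" below is just a stand-in:

    $ sudo apt-get remove packagename    # uninstall it
    $ sudo apt-get autoremove            # clear out dependencies nothing else needs any more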

    Defragment?:

    There's some disagreement about whether defragmenting is really unnecessary. It is certainly much less of an issue than it is on NTFS, and we can go pretty happily without worrying about it.

    As the author mentions, Ubuntu automatically runs fsck every 20 times you start up. It will tell you in the startup screen, at the bottom, when this is happening. (Don't worry, it's very quick nowadays).
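
    If you'd like to see how close your disk is to its next scheduled check, tune2fs can tell you (replace /dev/sda1 with your actual partition):

    $ sudo tune2fs -l /dev/sda1 | grep -i 'mount count'    # compare "Mount count" with "Maximum mount count"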

    The full disk thing has hit me before, and it is indeed important advice. As a general idea, just try not to fill up your disk all the way. (I don't remember if it still happens, but I have not-so-fond memories of a full disk breaking graphics).
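
    Checking how full things are is a one-liner, and there's nothing scary about it:

    $ df -h    # human-readable free space for every mounted file system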

    Clean Your Registry?:

    Linux software _sometimes_ uses dot files. There are actually a few different ways applications store their configuration. Lots of modern applications put their stuff in three places, where ~ is your home folder: ~/.config, ~/.cache and ~/.local. (".*" files are hidden; you can press Ctrl + H in the file manager to show them).
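
    If you want a quick look at how much space those hidden folders are using, du will tell you:

    $ du -sh ~/.config ~/.cache ~/.local    # total size of each of the three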

    Some software also puts settings in a system called gconf, which you can edit with gconf-editor. In newer versions of Ubuntu, gconf is being replaced with dconf. It might look kind of like the Windows Registry, but it won't explode and break everything. I doubt you'll need to worry about it, except to know that's where settings are (sometimes) stored. System-wide settings live separately from all of that, in places like /etc.
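
    You can also browse gconf from a terminal, read-only, just to see what's in there (assuming gconftool-2 is installed, which it usually is alongside gconf; /apps/metacity is only an example path):

    $ gconftool-2 -R /apps/metacity    # recursively list keys and values under one app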

    Regularly Reinstall to Clean up Cruft?:

    In theory, every single file in your system, outside your home folder, should be known to the package manager (which Software Centre and Update Manager talk to). Of course, this breaks if we install software with a script or something instead of through the package manager.

    To keep things tidy, if you are going to install something, look for it in Software Centre first. If it isn't there, see if there is a .deb file available :)
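
    A fun way to see this in action: ask dpkg (the low-level tool everything above sits on) which package owns a file, or what a package installed. Rhythmbox is just an example here:

    $ dpkg -S /usr/bin/rhythmbox    # which package does this file belong to?
    $ dpkg -L rhythmbox             # every file that package installed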

    Reboots Needed After All Patches?:

    In addition to what TheFu said, applications you run yourself are not restarted automatically, and neither are the libraries they load. If it's a critical security update for Firefox and you tend to leave your computer running for weeks, it might be a good idea to close Firefox and start it again.
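
    If you're curious which of your running programs are still holding on to files an update has since replaced, one rough check (assuming lsof is installed) looks like this; the checkrestart script in the debian-goodies package does a tidier job of the same thing:

    $ sudo lsof | grep DEL    # processes still mapping deleted (i.e. updated) files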

    Graphics Driver Updates:

    The Additional Drivers tool in Ubuntu will let you know if there are different drivers available for your hardware. Ubuntu has free drivers for just about everything out of the box, but NVidia and ATI graphics cards also have official drivers from their manufacturers. Don't worry about getting them from the manufacturers directly (actually: don't unless you really have a good reason to!). Instead, let Ubuntu take care of that automatically through the additional drivers tool or through Software Centre.

    Ubuntu also gets driver updates with the regular software updates, and thanks to a really awesome thing Dell made (DKMS), the drivers will rebuild themselves for new kernels on their own.
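
    If you're curious which drivers are being handled this way on your machine, dkms will list them:

    $ dkms status    # modules that get rebuilt automatically for each new kernel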

    On the proprietary vs. the default free graphics drivers, I agree with what TheFu says. As a rule of thumb, the proprietary ones are usually better tuned to the hardware (eg: really fast 3D), while the free ones are usually better integrated with the system (display mode only needs to be set once, maybe working suspend if you're lucky). Depends on your needs and how fancy your GPU is, really.

    • TheFu → Dylaner
      7/03/11 1:39pm

      Excellent advice from a slightly different perspective. Thanks. I tried to promote your comment without a reply first.

      Restarting Firefox weekly is probably good advice for everyone, regardless of platform. I'm shocked at the memory use of that program. It definitely grows over time, which tells me they are leaking memory or losing pointers. Memory leaks are bugs too, people. On this PC, FF is using 684MB of RAM with only 8 tabs open. This VM has only been running for 2 days. I honestly don't understand how a browser without any plugins (no Flash, no Java) uses THAT much RAM.
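
      If anyone wants to check their own numbers, ps will show them (depending on the build, the process may be named firefox or firefox-bin):

      $ ps -C firefox -o pid,rss,comm    # rss is resident memory in kilobytes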

  • Java-Princess → Whitson Gordon
    7/01/11 2:23pm

    Nice one TheFu. I don't do linux and most linux guys are pains in the ass. This is the first article I ever read concerning linux that doesn't bash Windows. Enjoyed the read.

    • Whitson Gordon → Java-Princess
      7/01/11 3:03pm

      My thoughts exactly. I was so happy to see an article that pointed out the strengths of Linux maintenance wise without being a huge dick. TheFu did a great job.

    • TheFu → Whitson Gordon
      7/03/11 1:58pm

      Shush. Don't tell anyone that I use Windows, sometimes. I actually had Windows95 installed about 4 months **before** the "gold" release. Where I worked was an early reference site and we were required to have over 2000 desktops with Win95 running on the release date.

      At my next job, I refused to downgrade to WinNT 3.51 and used Win95 until NT4 was available.

      Sure, Windows sucks, but so do Linux and OSX, just in different ways.

      Many thanks need to go to Whitson for re-publishing it and fixing some of the wording and mistakes in my version. Any remaining mistakes are mine, 100%. ;) The re-publishing did work. Traffic was only 2x the normal daily level even with 2 articles published on the same day. Thanks again.

  • donniezazen → Whitson Gordon
    7/02/11 12:56am

    You don't really have to do anything except keep it updated. I occasionally use Bleachbit to clean my system.

  • alcalde → Whitson Gordon
    7/01/11 4:10pm

    I'd have to disagree with the notion that Linux doesn't need defragmenting. One immediate note is that there are many file systems Linux can use, so the answer to this question depends on the specific file system being used. If we're assuming the most common file system, ext4, it's still not accurate to say that it doesn't need to be defragmented. It can become fragmented just like NTFS, except under different conditions.

    Let's say a hypothetical file system has 300 "blocks" for storing data (picture a bunch of boxes running from left to right). NTFS starts writing data at the first available free block. If you write a 100-block file, a 3-block file, then a 40-block file, the 100-block file takes up the first 100 blocks, the 3-block file the next 3, and the 40-block file the 40 after that. Now delete the 3-block file. If you then need to write a 20-block file, NTFS is going to use the first 3 free blocks (101-103), jump over the 40-block file, then write the remaining 17 blocks after it. Your file is now fragmented.

    When ext4 writes, it attempts to leave as much space as possible between files to avoid the fragmentation we just saw with NTFS. Ext4 may write that first 100 block file in the middle of the 300 blocks, the 3 block file in the middle of the free space to the left, and the last 40 blocks in the middle of the free space to the right. If you delete the 3 block file then need to write a 20 block file like in the NTFS example, there will be no fragmenting!

    That's what Linux users love to talk about. Now here's what they don't like to talk about. :-) Ext4 is maximizing the free space between files, while NTFS is maximizing contiguous free space. It's a polar opposite approach, which means each one's strength is the other's weakness. We've seen a situation where NTFS encountered fragmentation while ext4 did not. Let's exploit ext4's weakness and see how the situation changes. Let's write 7 10-block files to our 300-block partition, then a 50-block file. NTFS packs the files contiguously: the 7 10-block files take up the first 70 blocks, and the 50-block file goes right after them. We have 120 blocks used, 180 blocks free, no fragmentation.

    Ext4 puts the first 10-block file in the middle; that leaves 290 blocks free, 145 on either side. With a little rounding, after the next 2 files we have 4 free areas of about 67 blocks each (technically two of 67 and two of 68). After the last 4 files, we have 8 free areas of roughly 29 blocks apiece.

    (free space sizes after each round of writes)

    300

    145 - 145

    67 - 67 - 68 - 68

    28 - 29 - 28 - 29 - 29 - 29 - 29 - 29

    Now let's write that last 50-block file. Whoops, there's no single free area large enough! Ext4 is going to have to use about 29 blocks in one area and then put the remaining 21 blocks in another. Fragmentation!

    Ext4 now has kernel support for online defragmentation, which also serves to highlight the need for it. :-)
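
    If you want to see how fragmented your own files are, filefrag reports the number of extents per file, and newer versions of e2fsprogs also ship e4defrag, which can score a whole directory without changing anything. The path below is just an example:

    $ sudo filefrag -v /home/yourname/somebigfile.iso    # how many extents is this file split into?
    $ sudo e4defrag -c /home                             # fragmentation report only, no defragmenting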

    • Duijf → alcalde
      12/31/11 6:39am

      While what you state is true, file systems in the real world are much larger than 300 blocks, which is only about a megabyte at a block size of 4096 bytes. This means that write operations of small files, which are really prone to getting fragmented on NTFS, do not lead to fragmentation on ext4.

  • ArtInvent → Whitson Gordon
    7/01/11 6:33pm

    I would say, omit the bit about temporary files unless you really notice a shortage of disk space. The worst thing you can do is delete something you really need in the interest of saving a few tens of MB or something. A 1TB hard drive: $90.

    The last bit about 'just do at least these two things and you'll be fine' is excellent advice.

    As for fsck, the best advice I have is to figure out a schedule and run it WHEN YOU DON'T NEED YOUR COMPUTER. When it's left at the default of every 25 boots or so, you won't know exactly when the check will happen, and you can find yourself waiting for your box to boot while a check that takes half an hour or so runs. Not fun. Do it regularly at night before shutdown or something and reboot in the morning. Much less not-fun.
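
    On Ubuntu you can force that check yourself at a convenient time rather than being surprised by it; the classic trick (replace /dev/sda1 with your own partition if you use the tune2fs line) is:

    $ sudo touch /forcefsck           # ask for a full check on the next boot...
    $ sudo reboot                     # ...then reboot when it suits you, e.g. overnight
    $ sudo tune2fs -c 60 /dev/sda1    # or raise the every-N-mounts threshold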

    As for comparison to Windows, I probably do less to my Windows boxes. I prefer to get Windows where I want it and NOT SCREW WITH IT. Updates seem to solve one problem while creating others (lesson: create system restore points!!). But maybe that's just me. I do a software update on Ubuntu at least once a week (it's a very satisfying and straightforward process, actually) and find that the box just runs better and better over time. And I've never created a 'system restore point' on Ubuntu, wouldn't know how to do it, and never needed one.

    • TheFu → ArtInvent
      7/03/11 1:26pm

      fsck shouldn't take 30 minutes unless there are issues OR you are not running a journaled file system. For fun (I'm sick in that way) I umount'ed a 750GB ext4 partition and ran fsck against it.

      $ sudo time fsck -y /dev/sdb2
      fsck from util-linux-ng 2.17.2
      e2fsck 1.41.11 (14-Mar-2010)
      1TB has gone 197 days without being checked, check forced.
      Pass 1: Checking inodes, blocks, and sizes
      Pass 2: Checking directory structure
      Pass 3: Checking directory connectivity
      /lost+found not found. Create? yes
      Pass 4: Checking reference counts
      Pass 5: Checking group summary information
      1TB: ***** FILE SYSTEM WAS MODIFIED *****
      1TB: 333610/47407104 files (0.3% non-contiguous), 162180768/189607162 blocks
      Command exited with non-zero status 1
      7.34 user 0.86 system 0:33.89 elapsed 24%CPU (0avgtext+0avgdata 418704maxresident)k
      925956inputs+48920outputs (6major+32321minor)pagefaults 0swaps

      It took less than 34 seconds to run. The drive used was a 1TB WD "Black Drive." Nothing too special.

  • nstenz → Whitson Gordon
    7/02/11 2:06am

    There are Linux viruses, and if you pay any attention at all to the change logs shown for web browser updates in the update manager, you'll see regular fixes for security vulnerabilities that could compromise your system. Ignore this warning at your own peril.

    "apt-get update && apt-get dist-upgrade" is also the Linux equivalent of, "Check for updates, and automatically upgrade my install of XP to Vista when it comes out, too."

    "dist-upgrade" doesn't only install patches of your current programs; "apt-get upgrade" would do that.

    If stuff is working for you in, say, Ubuntu 10.10, and dist-upgrade bumps you to 11.04 and stuff breaks, you're not going to be a happy camper. I prefer to watch the tubes for reviews of the new version for at least a couple of weeks to see whether there are any changes that may bite me in the rear. Ubuntu regularly rolls out half-baked "features" in new releases and spends the next year or two working the kinks out of them. Those of us who have been using it long enough to have had the "pleasure" of being guinea pigs for the initial rollout of PulseAudio, for example, can tell you all about it...

    • nstenz → nstenz
      7/02/11 2:27am

      OK; I was incorrect. Debian distros won't jump releases unless you also run do-release-upgrade or edit your sources.list file.

      dist-upgrade WILL "remove obsolete packages from your system" in PREPARATION for a new release. On distros like Ubuntu, this could potentially remove a somewhat functional package you still use.
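
      For anyone following along, this is roughly how the commands differ on Ubuntu (do-release-upgrade comes with the update-manager-core package):

      $ sudo apt-get update            # refresh the package lists
      $ sudo apt-get upgrade           # install updates without adding or removing packages
      $ sudo apt-get dist-upgrade      # also add/remove packages if dependencies changed
      $ sudo do-release-upgrade        # the one that actually moves you to a new release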

      I wish I could edit my previous comment!

  • digital_man → Whitson Gordon
    7/01/11 5:52pm

    Very nice article and well stated about the Graphics Driver Updates.

    I have an ATI video card, and when I first installed Ubuntu, it wanted me to update the drivers for it, so I did. After that, none of the video files I played back were smooth any more. I reinstalled Ubuntu, did NOT update the graphics drivers, and my videos play back fine and smoothly.

    • avengingwatcher → digital_man
      7/02/11 1:53am

      You probably got some older ATI drivers, which the newer open source drivers have surpassed in many ways. There may be some very specific cases where the older ones are still worth using, though I never found any. If you switch to an NVidia card, don't be shocked if Plymouth (the boot splash) doesn't work, because Plymouth doesn't work with the current NVidia drivers.

  • GhostLyrics → Whitson Gordon
    7/01/11 2:27pm

    About nVidia: Ubuntu uses this for its nVidia stuff [en.wikipedia.org]. They also have a PPA which ships more up-to-date official drivers. I don't know about Debian, however; I don't think they use DKMS.

    • IdleThreats → GhostLyrics
      7/01/11 2:59pm

      I just get my installer from the source: NVidia. Stop your DE (GNOME/KDE/etc.), run the graphics driver installer, and assuming it finishes, start your DE again. I then like rebooting to ensure all is well, and you're good to go. This method works on just about all Debian and Debian-based systems.
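
      Roughly what that looks like on an Ubuntu box using GDM, run from a text console (Ctrl+Alt+F1); the display manager and the installer file name will vary with your setup, so treat these as placeholders:

      $ sudo service gdm stop                        # leave the graphical session
      $ sudo sh NVIDIA-Linux-x86_64-<version>.run    # whatever installer you downloaded from NVidia
      $ sudo service gdm start                       # or just reboot, as I prefer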

    • GhostLyrics → IdleThreats
      7/22/11 3:20pm

      PPAs are just great for lazy people. Like me. Also this is the PPA containing the drivers. (includes description)
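
      Adding any PPA boils down to something like this; 'some-team/graphics-drivers' is only a placeholder, not the actual PPA I mean:

      $ sudo add-apt-repository ppa:some-team/graphics-drivers
      $ sudo apt-get update && sudo apt-get upgrade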

  • fuhqued → Whitson Gordon
    7/01/11 5:54pm

    Linux is awesome and I would use it all of the time... but trying to run high-intensity MMORPGs under Wine just sucks. :( I have a good graphics card now, and I hate to see it wasted by running stuff under Wine. Other than that, I'd use Linux Mint all the time if Linux were up to par when it comes to gaming.

    • Blank → fuhqued
      7/02/11 4:47am

      Yep. It's almost better to virtualize than to use Wine; Wine is good for 'light' programs, but games and such just demand too much power.

    • fuhqued → Blank
      7/03/11 2:02am

      I'm not sure of the specifics of why they have so much of a problem on Linux and Wine, but one would think it wouldn't be too terribly hard to fix. =/

  • Wipeout → Whitson Gordon
    7/01/11 1:08pm

    Not a Linux user, but if there's one thing I've learned hanging out here at Lifehacker, it's that @TheFu knows his stuff. Nice article.

    • SavageRehab → Wipeout
      7/01/11 2:52pm

      +1, I have learnt so much from just reading comments that TheFu has posted. And for that I thank him.
