How to check disk space on Linux
If you run df -H, or equivalently df --si, sizes are reported in powers of 1000 rather than 1024, and you will find that numbers ending in .9G become numbers ending in .1G.
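To see the difference side by side, a quick comparison (assumes a Linux system with GNU coreutils):

```shell
# df -h reports sizes using powers of 1024 (binary units).
df -h /

# df -H (equivalent to df --si) reports sizes using powers of 1000,
# so the same partition shows slightly larger-looking numbers.
df -H /
```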
Easier method with ncdu
The ncdu command offers a much easier way to do all of the above.
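ncdu ("NCurses Disk Usage") is usually a separate install; the package name below assumes a Debian/Ubuntu-style system, so adjust for your distribution. It gives an interactive, sortable view of the same information du produces:

```shell
# Install ncdu (Debian/Ubuntu; use your distribution's package manager otherwise).
sudo apt install ncdu

# Scan a directory interactively: arrow keys browse, 'd' deletes, 'q' quits.
ncdu /var

# Non-interactive use: export a scan to a file, then browse it later.
ncdu -0 -o /tmp/var-scan.out /var
ncdu -f /tmp/var-scan.out
```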
Running out of disk space isn't the only problem you might face when running a Linux system. Let's take a closer look at the entire file system: when figuring out disk space issues it can be important to know what type of file system you're dealing with, so you can plan accordingly, and df -Th shows each mount's type alongside its usage.

btrfs fi df /device/ - show disk space usage information for a btrfs-based mount point/file system.
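For example (the exact columns and types will vary by system):

```shell
# -T adds a "Type" column (ext4, xfs, btrfs, tmpfs, ...).
df -Th /

# On a btrfs volume, btrfs keeps its own space accounting. This requires
# btrfs-progs and an actual btrfs mount point; the path is illustrative:
# btrfs fi df /mnt/data
```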
To find out what is actually taking up that space, we can use du and ncdu.
Don't go straight to du /. Scanning the whole root file system takes a long time, and the flood of output makes the real culprits hard to spot.
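Instead, start with a summarized or depth-limited scan of the likely suspects; a sketch (the directories chosen are just common examples):

```shell
# -s: summarize each argument; -h: human-readable sizes.
# Errors from unreadable files are discarded.
du -sh /var/log /tmp 2>/dev/null

# Or show one level of subdirectories to see where the space concentrates,
# sorted largest first (-h on sort understands human-readable sizes).
du -h --max-depth=1 /var 2>/dev/null | sort -h -r
```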
When I run this on my system I get the following output (some sizes and paths were lost from the original listing, shown here as "…"):

    # du -a /var | sort -nr | head -n 10
    …       /var
    3473128 /var/cache
    3470188 /var/cache/yum
    572308  /var/cache/yum/x86_…
    …       /var/cache/yum/x86_64/6Server
    375388  /var/log
    373056  /var/cache/yum/rhel6-auto
    …

This shows me that the largest items are the yum caches under /var/cache. Be careful acting on what you find, though: if you found large database files and removed them, you could produce catastrophic consequences, so make sure you know what a file is for before deleting it.

You can also hunt by size. For example, to list files which are anywhere between 1 GB and 5 GB you can do this:

    du -ch /home | grep '[0-5]G'

The output will look something like this:

    39G total

So with these few commands you're well equipped to relatively quickly hunt down those big bad files and directories on your file system. This will take you some time, but unless you have quotas set up, I think that's just the way it's going to be.

You may wonder why we use 1024 and not 1000: disk utilities traditionally count in binary units, where 1K is 1024 bytes, which is why df -h and df -H report slightly different numbers. You can, however, create hard links between files, which also use inodes.
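The size filter above can be made more precise with a regular expression. This sketch is an assumption about the intent (match human-readable sizes from 1G up to 5.9G at the start of each du line):

```shell
# -a: include files, not just directories; -h: human-readable sizes.
# The regex matches a leading size field of 1G-5.9G; \b stops 12G, 50G,
# etc. from slipping through.
du -ah /home 2>/dev/null | grep -E '^[1-5](\.[0-9])?G\b'
```

Note that grep exits non-zero (and prints nothing) when no file in that range exists, which is harmless here.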
This will mostly apply if you're running a Linux server.
head -n 10 will limit the number of results to the top ten largest entries.
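Putting the pieces together, the full pipeline from the earlier example looks like this:

```shell
# List everything under /var (-a includes files as well as directories),
# sort the sizes numerically with the largest first, and keep the top ten.
du -a /var 2>/dev/null | sort -n -r | head -n 10
```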