Strange space consumption on filesystem

Hello,
I have an x86 Solaris server running on VMware. c1t0d0 is the root disk of 40 GB. I am not able to find where the space is being consumed; only 2.6 GB is shown as available. There is no quota or reservation set. Can somebody give me some pointers to fix it?

-bash-3.2# zpool list
NAME                 SIZE  ALLOC   FREE  CAP  HEALTH  ALTROOT
pbdbm-wst-adm1      19.9G  11.3G  8.56G  56%  ONLINE  -
pbdbm-wst-esrpapp1  19.9G  1.90G  18.0G   9%  ONLINE  -
pbdbm-wst-lngapp1   19.9G  1.99G  17.9G  10%  ONLINE  -
pbdbm-wst-lpgapp1   19.9G  1.57G  18.3G   7%  ONLINE  -
pbdbm-wst-lsrgapp1  19.9G  1.44G  18.4G   7%  ONLINE  -
rpool               39.8G  36.5G  3.26G  91%  ONLINE  -
-bash-3.2# zfs list | grep -i rpool
rpool                           36.5G  2.64G  42.5K  /rpool
rpool/ROOT                      12.1G  2.64G    31K  legacy
rpool/ROOT/s10x_u11wos_24a      12.1G  2.64G  6.45G  /
rpool/ROOT/s10x_u11wos_24a/var  5.63G  2.64G  5.63G  /var
rpool/dump                      2.00G  2.64G  2.00G  -
rpool/export                     184K  2.64G    32K  /export
rpool/export/home                152K  2.64G   152K  /export/home
rpool/shared                    15.9G  2.64G  15.9G  /zones/shared
rpool/swap                      6.50G  2.64G  6.50G  -
-bash-3.2#
-bash-3.2#
-bash-3.2# df -h / /var /usr /opt
Filesystem             size   used  avail capacity  Mounted on
rpool/ROOT/s10x_u11wos_24a
                        39G   6.4G   2.6G    71%    /
rpool/ROOT/s10x_u11wos_24a/var
                        39G   5.6G   2.6G    69%    /var
rpool/ROOT/s10x_u11wos_24a
                        39G   6.4G   2.6G    71%    /
rpool/ROOT/s10x_u11wos_24a
                        39G   6.4G   2.6G    71%    /
-bash-3.2#
-bash-3.2# du -sh /var /usr /opt
 5.6G   /var
 5.7G   /usr
 313M   /opt
-bash-3.2# zfs get quota,reservation | grep -i rpool
rpool                           quota        none    default
rpool                           reservation  none    default
rpool/ROOT                      quota        none    default
rpool/ROOT                      reservation  none    default
rpool/ROOT/s10x_u11wos_24a      quota        none    default
rpool/ROOT/s10x_u11wos_24a      reservation  none    default
rpool/ROOT/s10x_u11wos_24a/var  quota        none    default
rpool/ROOT/s10x_u11wos_24a/var  reservation  none    default
rpool/dump                      quota        -       -
rpool/dump                      reservation  none    default
rpool/export                    quota        none    default
rpool/export                    reservation  none    default
rpool/export/home               quota        none    default
rpool/export/home               reservation  none    default
rpool/shared                    quota        none    default
rpool/shared                    reservation  none    default
rpool/swap                      quota        -       -
rpool/swap                      reservation  none    default
-bash-3.2#
-bash-3.2# zfs list -o space | grep -i rpool
rpool                           2.64G  36.5G         0   42.5K              0      36.5G
rpool/ROOT                      2.64G  12.1G         0     31K              0      12.1G
rpool/ROOT/s10x_u11wos_24a      2.64G  12.1G         0   6.45G              0      5.63G
rpool/ROOT/s10x_u11wos_24a/var  2.64G  5.63G         0   5.63G              0          0
rpool/dump                      2.64G  2.00G         0   2.00G          4.20M          0
rpool/export                    2.64G   184K         0     32K              0       152K
rpool/export/home               2.64G   152K         0    152K              0          0
rpool/shared                    2.64G  15.9G         0   15.9G              0          0
rpool/swap                      2.64G  6.50G         0   6.50G              0          0
-bash-3.2#
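
Side note: the grep strips the header line from zfs list -o space; the columns above are, in order, AVAIL, USED, USEDSNAP, USEDDS, USEDREFRESERV and USEDCHILD. Running the listing recursively on the pool instead would keep the header:

zfs list -o space -r rpool

Also, du of /var, /usr and /opt accounts for only about 12 GB, so the bulk of the 36.5 GB used in rpool must be sitting somewhere outside the root filesystem.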

Thanks

Do you have snapshots?

No, and there are no alternate boot environments either.

-bash-3.2# zfs list -t snapshot
no datasets available
-bash-3.2# lustatus
ERROR: No boot environments are configured on this system
ERROR: cannot determine list of all boot environment names
-bash-3.2#

What's the output from

zfs list -t all

and

zpool status -v

along with

df -h

with no arguments?
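
The reason: df against a specific path only reports the dataset that backs that path, so space consumed by other datasets in the same pool never shows up there. A plain df -h lists every mounted filesystem, and zfs list -t all shows every dataset, mounted or not, including snapshots.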

Here you go -

-bash-3.2# zpool list
NAME                 SIZE  ALLOC   FREE  CAP  HEALTH  ALTROOT
pbdbm-wst-adm1      19.9G  11.3G  8.55G  57%  ONLINE  -
pbdbm-wst-esrpapp1  19.9G  1.91G  18.0G   9%  ONLINE  -
pbdbm-wst-lngapp1   19.9G  2.00G  17.9G  10%  ONLINE  -
pbdbm-wst-lpgapp1   19.9G  1.58G  18.3G   7%  ONLINE  -
pbdbm-wst-lsrgapp1  19.9G  1.45G  18.4G   7%  ONLINE  -
rpool               39.8G  35.0G  4.80G  87%  ONLINE  -
-bash-3.2#
-bash-3.2#
-bash-3.2#
-bash-3.2# zfs list -t all
NAME                             USED  AVAIL  REFER  MOUNTPOINT
pbdbm-wst-adm1                  11.3G  8.24G  11.3G  /zones/pbdbm-wst-adm1
pbdbm-wst-esrpapp1              1.91G  17.7G  1.91G  /zones/pbdbm-wst-esrpapp1
pbdbm-wst-lngapp1               2.00G  17.6G  2.00G  /zones/pbdbm-wst-lngapp1
pbdbm-wst-lpgapp1               1.58G  18.0G  1.58G  /zones/pbdbm-wst-lpgapp1
pbdbm-wst-lsrgapp1              1.45G  18.1G  1.45G  /zones/pbdbm-wst-lsrgapp1
rpool                           35.0G  4.17G    42K  /rpool
rpool/ROOT                      10.5G  4.17G    31K  legacy
rpool/ROOT/s10x_u11wos_24a      10.5G  4.17G  6.10G  /
rpool/ROOT/s10x_u11wos_24a/var  4.44G  4.17G  4.44G  /var
rpool/dump                      2.00G  4.18G  2.00G  -
rpool/export                     184K  4.17G    32K  /export
rpool/export/home                152K  4.17G   152K  /export/home
rpool/shared                    15.9G  4.17G  15.9G  /zones/shared
rpool/swap                      6.50G  4.17G  6.50G  -
-bash-3.2#
-bash-3.2# zpool status -v
  pool: pbdbm-wst-adm1
 state: ONLINE
 scan: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        pbdbm-wst-adm1  ONLINE       0     0     0
          c1t1d0    ONLINE       0     0     0

errors: No known data errors

  pool: pbdbm-wst-esrpapp1
 state: ONLINE
 scan: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        pbdbm-wst-esrpapp1  ONLINE       0     0     0
          c1t2d0    ONLINE       0     0     0

errors: No known data errors

  pool: pbdbm-wst-lngapp1
 state: ONLINE
 scan: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        pbdbm-wst-lngapp1  ONLINE       0     0     0
          c1t3d0    ONLINE       0     0     0

errors: No known data errors

  pool: pbdbm-wst-lpgapp1
 state: ONLINE
 scan: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        pbdbm-wst-lpgapp1  ONLINE       0     0     0
          c1t4d0    ONLINE       0     0     0

errors: No known data errors

  pool: pbdbm-wst-lsrgapp1
 state: ONLINE
 scan: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        pbdbm-wst-lsrgapp1  ONLINE       0     0     0
          c1t5d0    ONLINE       0     0     0

errors: No known data errors

  pool: rpool
 state: ONLINE
 scan: none requested
config:

        NAME        STATE     READ WRITE CKSUM
        rpool       ONLINE       0     0     0
          c1t0d0s0  ONLINE       0     0     0

errors: No known data errors
-bash-3.2#
-bash-3.2#
-bash-3.2# df -h
Filesystem             size   used  avail capacity  Mounted on
rpool/ROOT/s10x_u11wos_24a
                        39G   6.1G   4.2G    60%    /
/devices                 0K     0K     0K     0%    /devices
ctfs                     0K     0K     0K     0%    /system/contract
proc                     0K     0K     0K     0%    /proc
mnttab                   0K     0K     0K     0%    /etc/mnttab
swap                    32G   904K    32G     1%    /etc/svc/volatile
objfs                    0K     0K     0K     0%    /system/object
sharefs                  0K     0K     0K     0%    /etc/dfs/sharetab
/usr/lib/libc/libc_hwcap1.so.1
                        10G   6.1G   4.2G    60%    /lib/libc.so.1
fd                       0K     0K     0K     0%    /dev/fd
rpool/ROOT/s10x_u11wos_24a/var
                        39G   4.4G   4.2G    52%    /var
swap                    32G     0K    32G     0%    /tmp
swap                    32G    52K    32G     1%    /var/run
rpool/export            39G    32K   4.2G     1%    /export
rpool/export/home       39G   152K   4.2G     1%    /export/home
rpool                   39G    42K   4.2G     1%    /rpool
pbdbm-wst-adm1          20G    11G   8.2G    58%    /zones/pbdbm-wst-adm1
pbdbm-wst-esrpapp1      20G   1.9G    18G    10%    /zones/pbdbm-wst-esrpapp1
pbdbm-wst-lngapp1       20G   2.0G    18G    11%    /zones/pbdbm-wst-lngapp1
pbdbm-wst-lpgapp1       20G   1.6G    18G     9%    /zones/pbdbm-wst-lpgapp1
pbdbm-wst-lsrgapp1      20G   1.5G    18G     8%    /zones/pbdbm-wst-lsrgapp1
rpool/shared            39G    16G   4.2G    80%    /zones/shared
-bash-3.2#

The filesystem rpool/shared is occupying most of the space in rpool; your earlier df -h / /var /usr /opt could not show it, since it is a separate dataset mounted at /zones/shared:

rpool/shared            39G    16G   4.2G    80%    /zones/shared
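
The numbers add up, too: roughly 10.5G (rpool/ROOT) + 15.9G (rpool/shared) + 6.5G (rpool/swap) + 2.0G (rpool/dump) comes to the ~35G the pool reports as allocated. If you want to see what is eating /zones/shared, something along these lines (path is just an example) will rank the biggest top-level directories:

du -sk /zones/shared/* | sort -n | tail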

Hope that helps
Regards
Peasant.


Oh, I completely forgot about that. Thanks for pointing it out!