rmlint - find duplicate files and other space waste efficiently

1. To find duplicate files in a test directory
$ pwd
/home/venus/test

Create three files with the same content in the test directory:
$ ls
file1.txt  file2.txt  file3.txt
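The setup above can be reproduced like this (a sketch using /tmp/rmlint-test instead of /home/venus/test; cksum is used only to show that all three files hash identically, which is what rmlint detects):

```shell
# Create a test directory with three files that share the same content
mkdir -p /tmp/rmlint-test
cd /tmp/rmlint-test
printf 'hello world\n' > file1.txt
cp file1.txt file2.txt
cp file1.txt file3.txt

# All three checksums match -- these are the duplicates rmlint will report
cksum file1.txt file2.txt file3.txt
```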

$ rmlint /home/venus/test
# Duplicate(s):
    ls '/home/venus/test/file1.txt'
    rm '/home/venus/test/file2.txt'
    rm '/home/venus/test/file3.txt'

==> Note: Please use the saved script below for removal, not the above output.
==> In total 3 files, whereof 2 are duplicates in 1 groups.
==> This equals 12 B of duplicates which could be removed.
==> Scanning took in total 0.146s.

Wrote a sh file to: /home/venus/test/rmlint.sh
Wrote a json file to: /home/venus/test/rmlint.json
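As the note above says, removal should be done through the saved rmlint.sh script, not by copying the printed commands. A sketch of that workflow (the -n dry-run and -d no-confirm flags are the ones the rmlint.sh generated by recent rmlint versions accepts; confirm with `sh rmlint.sh -h` on your system):

```shell
# Run the generated removal script only if it exists
script=/home/venus/test/rmlint.sh
if [ -f "$script" ]; then
    cat "$script"       # inspect the script before trusting it
    sh "$script" -n     # dry run: print what would be removed
    sh "$script" -d     # remove the duplicates without prompting
fi
```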

2. To print the help options
$ rmlint --help

3. To open rmlint in the GUI
$ rmlint --gui /home/venus/test



regards,
T.Dhanasekar