Linux one-liners
Diff two directories
diff -rq /dir1 /dir2
Set suid
File: executes as the user that owns the file, not as the user who ran it
chmod u+s
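A well-known real-world example is /usr/bin/passwd, which has to run as root so ordinary users can update /etc/shadow; on most distros its mode shows up as -rwsr-xr-x, the s in the owner execute slot being the suid bit:
ls -l /usr/bin/passwd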
Set sgid
File: executes as the group that owns the file
Dir: Files newly created in the directory have their group owner set to match the group owner of the directory
chmod g+s
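A sketch of the directory case, using a hypothetical shared directory and a group called developers; after g+s, files created inside get group developers no matter who creates them:
mkdir /srv/shared
chgrp developers /srv/shared
chmod g+s /srv/shared
ls -ld /srv/shared   # mode shows drwxr-sr-x, the s marking sgid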
Set Sticky bit
File: no effect
Dir: Users with write permission on the directory can only remove files that they own; they cannot remove files owned by other users
chmod o+t
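/tmp is the classic example: it is world-writable but its mode ends in t (drwxrwxrwt), so users cannot delete each other's files:
ls -ld /tmp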
Tar + gzip on the fly in Linux
# tar cvf - somedir | gzip -c > somedir.tar.gz
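GNU tar can also handle the compression itself via the z flag, so the same archive can usually be created (and later extracted) without the separate gzip pipe:
# tar czvf somedir.tar.gz somedir
# tar xzvf somedir.tar.gz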
List all ips in ifconfig output except loopback
ifconfig | grep 'inet addr:'| grep -v '127.0.0.1' | cut -d: -f2 | awk '{ print $1}' | uniq
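On newer distros where ifconfig may be missing, roughly the same list can be pulled from the ip tool (assumes iproute2 is installed; its output format differs from ifconfig):
ip -4 addr show | grep inet | grep -v '127.0.0.1' | awk '{print $2}' | cut -d/ -f1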
List sockets set to listen in httpd.conf
cat /path/to/httpd.conf | grep -i listen | grep -v "#" | awk '{print $2}' | sort -n | uniq
Find ips set to listen in httpd.conf
cat /path/to/httpd.conf | grep -i listen | grep -v "#" | cut -d: -f1 | awk '{print $2}' | sort -n | uniq
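Quick sanity check of the pipeline with a couple of made-up Listen lines (hypothetical IPs); it should print just 10.0.0.5:
printf 'Listen 10.0.0.5:80\nListen 10.0.0.5:443\n' | grep -i listen | grep -v "#" | cut -d: -f1 | awk '{print $2}' | sort -n | uniq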
Find sockets set to listen in all files under a directory
grep -Ri listen /path/to/vhosts | grep -v "#" | awk '{print $2}' | sort -n | uniq
Find ips set to listen in all files under a directory
grep -Ri listen /path/to/vhosts | grep -v "#" | awk '{print $2}' | sort -n | cut -d: -f1 | uniq
Find all ips set to listen in all httpd.conf files
locate httpd.conf | xargs grep -i listen | grep -v "#" | awk '{print $2}' | sort -n | cut -d: -f1 | uniq
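If the locate database is missing or stale, a find-based variant works too (searching under /etc is just a guess at where the Apache configs live):
find /etc -name httpd.conf 2>/dev/null | xargs grep -i listen | grep -v "#" | awk '{print $2}' | sort -n | cut -d: -f1 | uniq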
Scripts
Some useful Linux/Unix scripts
Linux dedupe problems?
Here's a nice dedupe script I found:
#!/bin/bash
# Filename: dedupe.sh
# source: http://ithacafreesoftware.org/forum/viewtopic.php?f=7&t=326
# this script takes two text files as input,
# sorts them and outputs lines from
# file 2 that do not exist in file 1
# into a new file called [file 2].clean
if [ -f "$1" ] && [ -f "$2" ]
then
    # sort both files, dropping duplicate lines
    sort -u "$1" > "$1.tmp"
    sort -u "$2" > "$2.tmp"
    # keep only the lines unique to file 2
    comm -13 "$1.tmp" "$2.tmp" > "$2.clean"
else
    echo "Usage: dedupe.sh file1 file2"
    echo "where file1 is the 'master' file"
    echo "and file2 is the file possibly containing duplicates"
fi
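Usage sketch with hypothetical file names: master.txt is the reference list, incoming.txt may contain lines already in master.txt, and the lines unique to incoming.txt end up in incoming.txt.clean:
chmod +x dedupe.sh
./dedupe.sh master.txt incoming.txt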