
Linux


Find recently modified files

# find all files modified more than 1 day ago
find path -mtime +1

# exclude cache and css
find . -mtime +1 | grep -v -e cache -e css

# only look at .php files
find . -name '*.php' -mtime +1
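A quick way to see `-mtime +1` in action is to build a scratch directory with one fresh file and one backdated file (the names here are made up; `touch -d` is the GNU syntax):

```shell
# Sketch: -mtime counts age in 24-hour units, so +1 means "older than 1 day"
dir=$(mktemp -d)
touch "$dir/new.php"
touch -d '2 days ago' "$dir/old.php"           # GNU touch date syntax
older=$(find "$dir" -name '*.php' -mtime +1)   # matches only old.php
rm -rf "$dir"
```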

Find files containing a string

grep word /var/www -r -l

grep 'several words' /var/www -r -l
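A self-contained sketch of what `-l` buys you (file names only, not the matching lines), using throwaway files:

```shell
dir=$(mktemp -d)
printf 'a word here\n' > "$dir/hit.txt"
printf 'nothing\n'     > "$dir/miss.txt"
files=$(grep -rl word "$dir")   # -l lists files containing a match
rm -rf "$dir"
```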

Remove BOM

find . -type f -exec sed 's/^\xEF\xBB\xBF//' -i.bak {} \; -exec rm {}.bak \;
# Removes the 3 byte-order-marks ef bb bf from all files in all folders
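To check the sed expression before running it over a whole tree, try it on a single scratch file (the `\357\273\277` octal escapes are the same EF BB BF bytes; GNU sed understands the `\xNN` form in the pattern):

```shell
f=$(mktemp)
printf '\357\273\277<?php\n' > "$f"   # write a UTF-8 BOM followed by <?php
sed -i 's/^\xEF\xBB\xBF//' "$f"
first=$(head -c 5 "$f")               # the file now starts with <?php
rm -f "$f"
```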


Find large files

ncdu is also a very nice utility for this: du with an interactive front end that lets you browse and delete.

du -hs * | grep M | sort -rn | head -n 10

cd /
du -hs
(look for a big directory and cd into it)
(repeat)

You can pipe du into grep to show just the GB- or MB-sized directories:

du -hs * | grep G
... or 
du -hs * | grep M

Example:

[root@tgn003 www]# cd /var/www/
[root@tgn003 www]# du -hs * | grep G
14G     sites

where a normal du shows:

[root@tgn003 www]# du -hs *
8.0K    cgi-bin
224K    error
19M     html
928K    icons
5.6M    manual
14G     sites
76K     usage


you can also get really tricky and....

[root@tgn003 var]# du -s * | sort -rn | head -n 10
14617056        www
4466176 lib
1009184 log
75460   cache
25012   spool
404     run
140     named
60      tmp
40      lock
32      empty

Which means
du -s *  == Disk Usage, summarize output
sort -rn  == Sort, reverse order, numeric sort
head -n 10  == Head of output, give me the top 10 lines
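Note that grep M also matches any name containing a capital M, and sort -rn compares only the leading digits of human-readable sizes, so 19M would sort above 14G. GNU sort has -h for exactly this case:

```shell
# du -hs * | sort -rh | head -n 10    # human-readable version of the pipeline
# Demonstrate -h ordering on canned du-style lines:
top=$(printf '19M\thtml\n14G\tsites\n8.0K\tcgi-bin\n' | sort -rh | head -n 1)
echo "$top"   # the 14G line comes first
```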

Find large folders

du --max-depth=0 * | sort -n

du -sh *

Find large files

# 10MB+ in current directory
find . -type f -size +10M -exec ls -lh {} \; | awk '{ print $9 ": " $5 }'
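With GNU find you can skip ls and awk entirely, since field positions in `ls -l` output vary between systems. A sketch on sparse scratch files:

```shell
dir=$(mktemp -d)
truncate -s 11M "$dir/big.bin"    # sparse test file just over 10 MiB
truncate -s 1M  "$dir/small.bin"
found=$(find "$dir" -type f -size +10M -printf '%f\n')   # GNU -printf: basename only
rm -rf "$dir"
```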

Rsync

rsync -avrz --progress $file_1 $file_2_with_wildcards* destination

Files
rsync -avrz --progress root@stratics.com:/data/sites/war2 .

Database
rsync -avrz --progress war* ddo* root@tgn.tv:/var/lib/mysql

Allow PHP to sudo commands

# visudo

Add
www-data ALL = NOPASSWD : ALL

Now PHP can

<?php
exec("sudo /etc/init.d/apache2 restart", $output);
echo implode("\n", $output);
?>
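That ALL rule lets the web user run any command as root. sudoers also accepts a per-command form if you only need the restart (path taken from the example above):

```
www-data ALL = NOPASSWD: /etc/init.d/apache2 restart
```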

Firewall whitelist
List IPs
iptables -vL
Whitelist IPs to SFTP firewall

cd /etc
vim firewall.conf
iptables-restore < /etc/firewall.conf

To add more IPs to the port-22 allow list, edit /etc/firewall.conf and add the entries there

Then apply the firewall settings
iptables-restore < /etc/firewall.conf
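An allow entry for /etc/firewall.conf, in the iptables-save format that iptables-restore reads (the IP is a documentation placeholder, and the INPUT chain name is assumed):

```
-A INPUT -s 203.0.113.7/32 -p tcp --dport 22 -j ACCEPT
```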

Whitelist IP to port 22

Look at denied sessions
tail -f /var/log/auth.log

vim /var/lib/denyhosts/allowed-hosts
/etc/init.d/denyhosts restart

IP should now be removed from
/etc/hosts.deny

Active sessions

Login times
# finger

Logged-in users
# who

Active IPs
# who --ips

Add users

adduser $user

vim /etc/group
Add $user after last : in www-data:x:33:

vim /etc/passwd
Change the group ID (the second number) to 33 (www-data's gid)

cd /path/in/question
chown -hR $user:www-data .

Upgrade Java

apt-cache search java

apt-get update

apt-get install openjdk-6-jre-headless
-or-
apt-get install sun-java6-jre

update-java-alternatives --list
update-java-alternatives --set java-6-openjdk
^-- Or whatever you like!


Transfer files

wget -r ftp://username:password@ip.of.old.host

wget -m ... (mirror)

Compress

# good compression, fast
tar -zcvf archive.tar.gz path

# extract
tar -zxvf archive.tar.gz


# best compression, slow
tar -jcvf archive.tar.bz2 path

# extract
tar -jxvf archive.tar.bz2


# no compression
tar -cvf archive.tar path

# extract
tar -xvf archive.tar


# zip, well-known but bad compression
zip -r archive.zip path

# unzip
unzip archive.zip
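The create/extract pairs above can be checked end to end in a scratch directory (file names are made up; `-C` tells tar which directory to work in):

```shell
work=$(mktemp -d)
mkdir "$work/path"
echo hello > "$work/path/file.txt"
tar -C "$work" -zcf "$work/archive.tar.gz" path       # create
mkdir "$work/out"
tar -C "$work/out" -zxf "$work/archive.tar.gz"        # extract elsewhere
roundtrip=$(cat "$work/out/path/file.txt")
rm -rf "$work"
```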

Disk info

$ df
File system disk space usage

$ df -h
Human-readable sizes (K, M, G)

Memory info

$ free -m
Free memory in MB

Find and replace over all files


Recursively

find ./ -type f -exec sed -i 's/string1/string2/' {} \;
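The find/sed combination is worth rehearsing on throwaway files first, since it edits in place with no backup:

```shell
dir=$(mktemp -d)
printf 'string1 here\n' > "$dir/a.txt"
find "$dir" -type f -exec sed -i 's/string1/string2/' {} \;
replaced=$(cat "$dir/a.txt")
rm -rf "$dir"
```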


In same folder

perl -pi -w -e 's/search/replace/g;' *.php
-e means execute the following line of code
-i means edit in place
-w enables warnings
-p loops over input lines, printing each one after the substitution

See
http://www.liamdelahunty.com/tips/linux_search_and_replace_multiple_files.php

Find a file

find /path/ -name file.ext

find /path/ -iname file.ext
case-insensitive name match

find . -iname .htaccess 2> /dev/null
hide errors

find . -iname '*whatever*' | grep -v 'regex_to_match_your_errors'
hide lines that contain text

Find a string in a file

egrep -r 'regex' ./dir_to_search
extended regular expression mode, supports more regexp syntax

egrep equivalent to grep -E

egrep -ir 'regex' ./dir_to_search_recursively
case insensitive

find . -iname '.htaccess' -exec grep -iE 'extended_regex' {} \;
{} placeholder
\; finishing argument in exec

find . -iname '.htaccess' -exec grep -iHE 'extended_regex' {} \;
-H show filename
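A self-contained run of the find-plus-grep pattern, showing how -H prefixes each match with its file name:

```shell
dir=$(mktemp -d)
mkdir "$dir/sub"
printf 'RewriteEngine On\n' > "$dir/sub/.htaccess"
hit=$(find "$dir" -iname '.htaccess' -exec grep -iHE 'rewrite' {} \;)
rm -rf "$dir"
```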

Find a file containing a string

$ grep -H -r "string" /path
-r recurses into all subdirectories
-H prints the filename for each match


$ find /path/ -name file.ext -exec grep "string" {} \; -print

Find files modified since

find . -iname '*.php' -mtime -4 -print 2> /dev/null

Find which cron jobs are running

crontab -l

Restart Apache

/path/to/apache/bin/apachectl restart

Path

pwd
print working directory

Copy folder

cp -r folder destination/

Zip

zip -r file.zip path/*

Monitor slow MySQL queries

tail -f /var/log/mysql/mysql-slow.log

Monitors the file for appends and displays them as they arrive

Works on any log file at all

tail -f is one of those essential tools
