Shell tricks



#61 mobilec

mobilec

    SCRiPT KiDDie

  • Members
  • 25 posts
  • Location:Ottawa, Canada

Posted 17 February 2008 - 07:03 PM

Try echo -n >file instead. No newline.

touch file
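
The difference shows up with wc -c: a plain echo leaves one newline byte behind, while echo -n and touch both leave the file truly empty.

$ echo > a; echo -n > b; touch c
$ wc -c a b c
1 a
0 b
0 c
1 total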


#62 duper

duper

    Dangerous free thinker

  • Members
  • 816 posts
  • Location:NYC

Posted 17 February 2008 - 07:15 PM

Or simply:

>file

The shell creates (or truncates) the file just by setting up the redirection; no command needed.

Edited by duper, 17 February 2008 - 07:16 PM.


#63 mirrorshades

mirrorshades

    aviatorglasses

  • Agents of the Revolution
  • 951 posts
  • Gender:Male

Posted 17 April 2008 - 02:49 PM

Okay... this may be a trick, because it's not in the manpage and I've not really seen it listed anywhere else. :)

To quickly move back and forth between two directories, you can use a single hyphen (-) after the "cd" command. Basically, it takes you back to the directory you were in immediately before the current one:

root@nato [/]# cd /etc/ssl
root@nato [/etc/ssl]# cd /
root@nato [/]# cd -
/etc/ssl
root@nato [/etc/ssl]# cd -
/
root@nato [/]#
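
For what it's worth, bash tracks this in the OLDPWD variable; cd - is equivalent to cd "$OLDPWD" followed by pwd, which is why the target directory gets printed. Continuing the session above:

root@nato [/]# echo $OLDPWD
/etc/ssl
root@nato [/]# cd "$OLDPWD"   # same effect as cd -
root@nato [/etc/ssl]#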


#64 PurpleJesus

PurpleJesus

    Dangerous free thinker

  • Members
  • 1,578 posts
  • Gender:Male
  • Location:800

Posted 17 April 2008 - 08:46 PM

Cool... it works.

#65 inaequitas

inaequitas

    SUP3R 31337

  • Members
  • 158 posts

Posted 19 April 2008 - 10:24 AM

If you want to keep track of multiple folders you've been through, try pushd / popd. You basically get a 'directory stack' to push to and pop from to your heart's content.
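
A quick illustration in bash, where dirs -v prints the stack with indices:

$ pushd /etc
/etc ~
$ pushd /var/log
/var/log /etc ~
$ dirs -v
 0  /var/log
 1  /etc
 2  ~
$ popd
/etc ~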

#66 .solo

.solo

    Gibson Hacker

  • Members
  • 80 posts

Posted 19 April 2008 - 04:41 PM

I often forget to use sudo, and having to up-arrow and add it is a bitch. Instead do a
[codebox]sudo !![/codebox]
!! denotes the last command in history.
[codebox]echo $?[/codebox]
returns the exit code of the last command: 0 means success, and any nonzero value (1-255) means failure, with the exact meaning up to each command.
[codebox]cd ~[/codebox]
~ denotes your home directory. Again, not a groundbreaking trick, just something beginners might find useful.
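
A typical session (bash echoes the expanded command before running it):

$ cat /etc/shadow
cat: /etc/shadow: Permission denied
$ sudo !!
sudo cat /etc/shadow
[sudo] password for user: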

Edited by .solo, 19 April 2008 - 04:44 PM.


#67 .solo

.solo

    Gibson Hacker

  • Members
  • 80 posts

Posted 19 April 2008 - 04:47 PM

why the fuck are my code boxes HUGE!?

#68 Dirk Chestnut

Dirk Chestnut

    SUP3R 31337 P1MP

  • Members
  • 268 posts
  • Location:248

Posted 11 June 2008 - 10:14 PM

Here's a yum specific trick:

I typically work with Redhat-related systems (Redhat, Fedora, CentOS, etc.) and use yum as my preferred mechanism for managing package installation and dependencies. I also do a lot of dev work in Perl, which means installing lots of modules from CPAN. The sanest way for me to install a module is usually through my distro's repositories, if the distro packages it. That isn't always easy, though: packages are rarely named Some::Module in the repos, nor are they always called perl-Some-Module.

Take the following example (from Fedora 8) for a common module, LWP::Simple:

$ yum search LWP::Simple
No Matches found
$ yum search perl-LWP-Simple
No Matches found

It's not that Fedora 8 doesn't have its own package for LWP::Simple; it's just bundled with something else and named in a way you wouldn't necessarily expect. If you instead run this command:

$ yum whatprovides 'perl(LWP::Simple)'
perl-libwww-perl.noarch : A Perl interface to the World-Wide Web

Voila! We now know that perl-libwww-perl is the package that provides the LWP::Simple module.
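
The same trick works when all you know is a file path rather than a Perl module name; for example (output abbreviated, and the exact package naming varies by release):

$ yum whatprovides '*/sbin/ifconfig'
net-tools.x86_64 : Basic networking tools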

#69 Baphomet

Baphomet

    Will I break 10 posts?

  • Members
  • 6 posts

Posted 07 July 2008 - 01:28 PM

Here's my "trick"

$? holds the return value for the last command. E.g.

$ curl urlthatdoesntexist.com
curl: (6) Couldn't resolve host 'urlthatdoesntexist.com'
$ echo $?
6
$ curl www.google.com > /dev/null
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  6538    0  6538    0     0  18957      0 --:--:-- --:--:-- --:--:-- 55232
$ echo $?
0
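
Since exit status is exactly what if, &&, and || test, scripts rarely need to read $? by hand; a minimal sketch:

if curl -s www.google.com > /dev/null; then
    echo "reachable"
else
    echo "curl failed with status $?"  # $? still holds curl's status here
fi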

EDIT: Oops, just realized this has already been covered, hopefully my elaboration will add something ;)

pushd stores the current directory and changes to a new one; popd returns to the last stored directory.
$ pwd
/home/baphomet
$ pushd /
/ ~
$ pwd
/
$ popd
~
$ pwd
/home/baphomet

Edited by Baphomet, 07 July 2008 - 01:32 PM.


#70 Spyril

Spyril

    Hakker addict

  • Members
  • 588 posts
  • Location:North Dakota

Posted 10 July 2008 - 10:00 PM

View most-used commands:
history | awk '{print $2}' | awk 'BEGIN {FS="|"} {print $1}'|sort|uniq -c | sort -n | tail | sort -nr


#71 Ohm

Ohm

    I could have written a book with all of these posts

  • Members
  • 3,209 posts
  • Gender:Male
  • Location:Maine, USA

Posted 10 July 2008 - 10:55 PM

View most-used commands:

history | awk '{print $2}' | awk 'BEGIN {FS="|"} {print $1}'|sort|uniq -c | sort -n | tail | sort -nr


I think you took a few extra steps in there. The second awk doesn't do anything: only the first word of each command gets through the first awk, and there aren't likely to be any pipe characters in that, so the second awk is essentially a no-op. You also do an extra sort.

history | awk '{print $2}' | sort | uniq -c | sort -nr | head
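
If you use it often, it makes a good alias (tophist is an arbitrary name):

alias tophist="history | awk '{print \$2}' | sort | uniq -c | sort -nr | head"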


#72 Alk3

Alk3

    "I Hack, therefore, I am"

  • Binrev Financier
  • 1,003 posts
  • Gender:Not Telling
  • Location:312 Chi-town

Posted 24 July 2008 - 09:11 PM

At work I had to delete email for about 50 users on a server running Courier IMAP as the inbound email service. Somehow they had accumulated over 100GB of email, so I was told to delete it all, but avoid destroying the Maildir directories. The script uses only system commands, so it is unlikely to break unless there is a big problem with the OS, but it runs slower than a script in perl/python/ruby.

localhost:~$ cat read.sh
#!/bin/bash

workingdir='/tmp/test'
cd $workingdir

# build an array of the account directories under $workingdir
array=(`ls`)
len=${#array[*]}
i=0

# write one directory name per line to output.txt
while [ $i -lt $len ]; do
    echo "${array[$i]}" >> $HOME/output.txt
    let i++
done

# list everything under each account's cur/ directory
cat $HOME/output.txt | while read line; do
    ls $workingdir/"${line}"/cur/* >> $HOME/result.txt  # change this to: ls $workingdir/"${line}"/cur/* | xargs rm -rf
done

This will work for any scenario where you want to delete large quantities of files within a specific directory structure. I thought I would post it because it contains quite a few shell tricks.

You can play with this if you make a directory structure in /tmp like this:

localhost:~$ ls /tmp/test/*/*
/tmp/test/1/cur:
file

/tmp/test/2/cur:
file

/tmp/test/3/cur:
file

/tmp/test/4/cur:
file



#73 Ohm

Ohm

    I could have written a book with all of these posts

  • Members
  • 3,209 posts
  • Gender:Male
  • Location:Maine, USA

Posted 25 July 2008 - 12:36 AM

You just want to rm -Rf all files under $workingdir/"${line}"/cur/? That's a one liner, why is this a 20 liner?
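
Presumably something like this, letting the glob do the iteration the script spells out:

rm -Rf "$workingdir"/*/cur/*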

#74 Alk3

Alk3

    "I Hack, therefore, I am"

  • Binrev Financier
  • 1,003 posts
  • Gender:Not Telling
  • Location:312 Chi-town

Posted 25 July 2008 - 05:55 PM

You just want to rm -Rf all files under $workingdir/"${line}"/cur/? That's a one liner, why is this a 20 liner?


Because I was working with a directory structure that has about 500 email accounts. Quite a few things are left out of this script to avoid posting SQL scripts from a proprietary software company on a public forum.

The directory structure is like this: /var/mail/<customerid>/Maildir/cur

The customer IDs are pulled from a SQL database and fed into my bash script, which deletes the contents of each person's inbox. The IDs selected in my case belonged to inactive customers whose email had been disabled for misuse of some kind: spam, porn, large attachments, you name it.
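
The driving loop would look something like this sketch (ids.txt is a hypothetical one-id-per-line export from the database):

while read -r id; do
    rm -rf "/var/mail/$id/Maildir/cur/"*
done < ids.txt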

#75 Ohm

Ohm

    I could have written a book with all of these posts

  • Members
  • 3,209 posts
  • Gender:Male
  • Location:Maine, USA

Posted 25 July 2008 - 10:34 PM

Well, with the interesting parts gutted, this script does essentially nothing. I'm just wondering why you posted it.

#76 L33T_j0sH

L33T_j0sH

    SUP3R 31337

  • Members
  • 185 posts
  • Location:ALABAMA

Posted 12 October 2008 - 09:50 PM

Wow... I didn't know about the alias command. I have already used it multiple times since reading this thread today. Very useful command.

#77 duper

duper

    Dangerous free thinker

  • Members
  • 816 posts
  • Location:NYC

Posted 13 October 2008 - 01:51 PM

This reverse-resolves every address on the local interfaces, generating a list of the Linux box's virtual hosts on the fly:

#!/bin/sh
for i in `/sbin/ifconfig -a | grep 'inet addr' | awk -F: '{print $2}' | awk '{print $1}'`; do
    host $i
done 2>/dev/null

Can anybody come up with anything that uses a different approach or is more concise?
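
One possible answer, assuming the box has iproute2: ip has more regular output than ifconfig, so a single awk does the job.

#!/bin/sh
for i in `ip -4 -o addr show | awk '{sub("/.*", "", $4); print $4}'`; do
    host $i
done 2>/dev/null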

#78 MrFluffy

MrFluffy

    HACK THE PLANET!

  • Validating
  • 68 posts
  • Country:
  • Gender:Male
  • Location:somewhere

Posted 23 April 2010 - 06:50 AM

Actually, although I'm replying to prehistory: rm -rf, or rather the kernel, has a hard limit on how many files it can take in one go. It spews back "/bin/rm: Argument list too long." when the expanded argument list exceeds the kernel's execve() buffer (historically only 128k).

However, a simple solution is below. find streams the names instead of putting them all on one command line, and xargs batches them into as many rm invocations as needed (-type f, added here, keeps directories out of the list, since plain rm can't remove those anyway):

find . -type f -print0 | xargs -0 rm

Original info provided by John Simpson; I just googled it when I hit the above limit one day.

http://leap-cf.org/o...May/038802.html

#79 Mikhail

Mikhail

    SCRiPT KiDDie

  • Members
  • 23 posts
  • Location:Long Beach, Ca

Posted 06 November 2010 - 03:49 PM

Using < and > is called I/O redirection.
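
The common forms, for anyone who hasn't run into them:

command < in.txt          # stdin read from in.txt
command > out.txt         # stdout written to out.txt (truncates it first)
command >> out.txt        # stdout appended to out.txt
command 2> err.txt        # stderr written to err.txt
command > all.txt 2>&1    # stdout and stderr both into all.txt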

#80 n3xg3n

n3xg3n

    "I Hack, therefore, I am"

  • Members
  • 960 posts
  • Country:
  • Gender:Male
  • Location:(703)

Posted 21 November 2010 - 08:37 PM

If you make a mistake typing a command you can change something specific and then re-execute using the following bash trick:

[n3xg3n@enigma ~]$ wgt http://www.whatismyip.org
bash: wgt: command not found
[n3xg3n@enigma ~]$ ^wgt^wget
wget http://www.whatismyip.org
--2010-11-21 20:35:48--  http://www.whatismyip.org/
Resolving www.whatismyip.org...

Basically, ^old^new will replace 'old' with 'new' in the previous command and re-execute it.
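
Note that ^old^new only touches the first occurrence; if the typo appears more than once, the gs (global substitute) history modifier rewrites them all:

$ echo foo foo
foo foo
$ !!:gs/foo/bar/
echo bar bar
bar bar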



