passwd command

12.05.2009

SUMMARY:

Allows you to change your password.

SYNTAX:

-r      Specifies the repository to which an operation is applied. The supported repositories are files, nis or nisplus.
-a     Show password attributes for all entries. Use only with the -s option; name must not be provided. For the nisplus repository, this will show only the entries in the NIS+ password table in the local domain that the invoker is authorized to “read”. For the files repository, this is restricted to the superuser.
-d     Deletes the password for name. The user will no longer be prompted for a password at login. Applicable only to the files repository.
-l     Locks password entry for name.
-e     Change the login shell. For the files repository, this only works for the superuser. Normal users may change the nis or nisplus repositories. The choice of shell is limited by the requirements of getusershell(3C). If the user currently has a shell that is not allowed by getusershell, only root may change it.
-f     Force the user to change password at the next login by expiring the password for name.
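The administrative options above can be sketched as a dry run. Note that alice is a placeholder user name, and on a real system these commands must be run as the superuser, so each one is echoed here rather than executed:

```shell
# Dry-run sketch of the administrative passwd options described above.
# "alice" is a placeholder user name; on a real system these commands
# require superuser privileges, so they are echoed, not executed.
for cmd in "passwd -l alice" "passwd -f alice" "passwd -d alice"; do
  echo "would run: $cmd"
done
```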

EXAMPLES:

passwd  – entering just passwd allows you to change your password. After entering passwd you will receive the following three prompts:

Current Password:
New Password:
Confirm New Password:

Each of these prompts must be entered and entered correctly for the password to be successfully changed.


wc command

04.05.2009

SUMMARY:

The wc (word count) command is a very simple utility found in all Unix variants. Its purpose is counting the number of lines, words and characters of text files. If multiple files are specified, wc produces a count for each file, plus totals for all files.

When used without options wc prints the number of lines, words and characters, in that order. A word is a sequence of one or more characters delimited by whitespace. If we want fewer than the three counts, we use options to select what is to be printed: -l to print lines, -w to print words and -c to print characters. The GNU version of wc found in Linux systems also supports the long options format: --chars (or --bytes), --words, --lines.
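As a quick illustration of the three counts, here is a short session on a throwaway file (sample.txt is just a name chosen for this sketch):

```shell
# Create a small two-line file and count it three ways.
printf 'hello world\nfoo bar baz\n' > sample.txt

wc -l < sample.txt   # number of lines: 2
wc -w < sample.txt   # number of words: 5
wc -c < sample.txt   # number of bytes: 24 (12 per line, counting the newline)

wc sample.txt        # all three counts plus the file name
rm sample.txt
```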

SYNTAX:

wc [-c | -m | -C ] [-l] [-w] [ file … ]
-c     Count bytes.
-m     Count characters.
-C     Same as -m.
-l     Count lines.
-w     Count words delimited by white space characters or new line characters. Delimiting characters are Extended Unix Code (EUC) characters from any code set defined by iswspace().
file     Name of file to word count.

EXAMPLES:

If we want to count how many words are in line 70 of file foo.txt then we use:

head -70 foo.txt | tail -1 | wc -w

Here, the command head -70 outputs the first 70 lines of the file, the command tail -1 (i.e., the number 1) outputs the last line of its input, which happens to be line 70 of foo.txt, and wc counts how many words are in that line.
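The same pipeline can be tried end-to-end on generated data; here foo.txt is built on the fly so that line 70 is known to contain exactly three words:

```shell
# Build a 100-line file whose 70th line has exactly three words,
# then count the words on that line with the head | tail | wc pipeline.
awk 'BEGIN { for (i = 1; i <= 100; i++) print (i == 70 ? "alpha beta gamma" : "line " i) }' > foo.txt

head -70 foo.txt | tail -1 | wc -w   # counts the words on line 70
rm foo.txt
```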

wget command

01.04.2009

SUMMARY:

There are many ways to download files. But there is only one smart way to download from the command line – wget. The wget tool is a non-interactive network download tool that can download single files, recursively download entire directories, and even follow links.

SYNTAX:

-V
--version
Display the version of Wget.
-h
--help
Print a help message describing all of Wget's command-line options.
-b
--background
Go to background immediately after startup. If no output file is specified via the -o option, output is redirected to wget-log.
-e command
--execute command
Execute command as if it were a part of .wgetrc. A command thus invoked will be executed after the commands in .wgetrc, thus taking precedence over them.

EXAMPLES:

Download a single file using wget

$ wget http://www.cyberciti.biz/here/lsst.tar.gz
$ wget ftp://ftp.freebsd.org/pub/sys.tar.gz

Download multiple files on command line using wget

i) Create a variable that holds all URLs and later use a BASH for loop to download all files:
$ URLS="http://www.cyberciti.biz/download/lsst.tar.gz ftp://ftp.freebsd.org/pub/sys.tar.gz ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm http://xyz.com/abc.iso"

ii) Use the for loop as follows:
$ for u in $URLS; do wget $u; done

iii) However, a better way is to put all URLs in a text file and use the -i option of wget to download all files:

(a) Create a text file using vi:
$ vi /tmp/download.txt

Add the list of URLs:
http://www.cyberciti.biz/download/lsst.tar.gz
ftp://ftp.freebsd.org/pub/sys.tar.gz
ftp://ftp.redhat.com/pub/xyz-1rc-i386.rpm
http://xyz.com/abc.iso

(b) Run wget as follows:
$ wget -i /tmp/download.txt

(c) Force wget to resume a download
You can use the -c option of wget. This is useful when you want to finish a download started by a previous instance of wget after the network connection was lost. In such a case, add the -c option as follows:
$ wget -c http://www.cyberciti.biz/download/lsst.tar.gz
$ wget -c -i /tmp/download.txt
Please note that not all FTP/HTTP servers support the download resume feature.

Force wget to download all files in background, and log the activity in a file:

$ wget -cb -o /tmp/download.log -i /tmp/download.txt

OR

$ nohup wget -c -o /tmp/download.log -i /tmp/download.txt &

nohup runs the given command (in this example wget) with hangup signals ignored, so that the command can continue running in the background after you log out.
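The for-loop pattern from example (i) can be dry-run without network access by shadowing wget with a shell function; the URLs below are the same placeholders used above, and nothing is actually fetched:

```shell
# Dry-run of the URL-list loop: wget is shadowed by a shell function
# so no network access happens; the URLs are placeholders from the text.
wget() { echo "would fetch: $1"; }

URLS="http://www.cyberciti.biz/download/lsst.tar.gz ftp://ftp.freebsd.org/pub/sys.tar.gz"
for u in $URLS; do wget "$u"; done
```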

See man page of wget for more advanced options.