Saturday, August 11, 2007
CVS Server How-To
http://greenisland.csie.nctu.edu.tw/wp/2005/06/16/26/
1. Install CVS
2. vi /etc/services
should have these two lines:
cvspserver 2401/tcp # cvs client/server operations
cvspserver 2401/udp # cvs client/server operations
3. Add group and user
> groupadd cvs
> adduser cvsroot
> passwd cvsroot
4. Change ownership and permissions of /home/cvsroot
> chown cvsroot.cvs /home/cvsroot
> chmod 755 /home/cvsroot
5. xinetd settings
> cd /etc/xinetd.d
> vi cvspserver
service cvspserver
{
disable = no
flags = REUSE
socket_type = stream
wait = no
user = root
server = /usr/bin/cvs
server_args = --allow-root=/home/cvsroot pserver
log_on_failure += USERID
}
6. Initialize CVS repository
> cvs -d /home/cvsroot init
7. Restart xinetd
> /etc/init.d/xinetd restart
8. That's all. Now we can log in to CVS.
> cvs -d :pserver:cvsroot@localhost:/home/cvsroot login
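As a quick sanity check, you can import a project and check it out again. A minimal sketch, assuming a source directory named myproject (the module name and the vendor/release tags here are just for illustration):
> cd myproject
> cvs -d :pserver:cvsroot@localhost:/home/cvsroot import -m "initial import" myproject vendor start
> cd /tmp
> cvs -d :pserver:cvsroot@localhost:/home/cvsroot checkout myproject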
Saturday, January 27, 2007
BASH Quick Guide
A quick guide to writing scripts using the bash shell
A simple shell script
A shell script is little more than a list of commands that are run in sequence. Conventionally, a shell script should start with a line such as the following:
#!/bin/bash
This indicates that the script should be run in the bash shell regardless of which interactive shell the user has chosen. This is very important, since the syntax of different shells can vary greatly.
A simple example
Here's a very simple example of a shell script. It just runs a few simple commands:
#!/bin/bash
echo "hello, $USER. I wish to list some files of yours"
echo "listing files in the current directory, $PWD"
ls # list files
Firstly, notice the comment on line 4. In a bash script, anything following a pound sign # (besides the shell name on the first line) is treated as a comment, i.e. the shell ignores it. It is there for the benefit of people reading the script.
$USER and $PWD are variables. These are standard variables defined by the bash shell itself; they needn't be defined in the script. Note that the variables are expanded when the variable name is inside double quotes. Expanded is a very appropriate word: the shell basically sees the string $USER and replaces it with the variable's value, then executes the command.
We continue the discussion on variables below ...
Variables
Any programming language needs variables. You define a variable as follows:
X="hello"
and refer to it as follows:
$X
More specifically, $X is used to denote the value of the variable X. Some things to take note of regarding semantics:
- bash gets unhappy if you leave a space on either side of the = sign. For example, the following gives an error message:
X = hello
- while I have quotes in my example, they are not always necessary. Where you do need quotes is when the value being assigned contains spaces. For example,
X=hello world # error
X="hello world" # OK
The reason is that
foo=bar
is considered a command. The problem with
foo = bar
is that the shell sees the word foo separated by spaces and interprets it as a command. Likewise, the problem with the command
X=hello world
is that the shell interprets
X=hello
as a command, and the word "world" does not make any sense (since the assignment command doesn't take arguments).
Single Quotes versus double quotes
Basically, variable names are expanded within double quotes, but not single quotes. If you do not need to refer to variables, single quotes are good to use, as the results are more predictable.
An example
#!/bin/bash
echo -n '$USER=' # -n option stops echo from breaking the line
echo "$USER"
echo "\$USER=$USER" # this does the same thing as the first two lines
The output looks like this (assuming your username is elflord):
$USER=elflord
$USER=elflord
So the double quotes still have a workaround. Double quotes are more flexible, but less predictable. Given the choice between single quotes and double quotes, use single quotes.
Using Quotes to enclose your variables
Sometimes, it is a good idea to protect variable names in double quotes. This is usually most important if your variable's value either (a) contains spaces or (b) is the empty string. An example is as follows:
#!/bin/bash
X=""
if [ -n $X ]; then # -n tests to see if the argument is non empty
echo "the variable X is not the empty string"
fi
This script will give the following output:
the variable X is not the empty string
Why? Because the shell expands $X to the empty string. The expression [ -n ] returns true (since it is not provided with an argument). A better script would have been:
#!/bin/bash
X=""
if [ -n "$X" ]; then # -n tests to see if the argument is non empty
echo "the variable X is not the empty string"
fi
In this example, the expression expands to [ -n "" ], which returns false, since the string enclosed in inverted commas is clearly empty.
Variable Expansion in action
Just to convince you that the shell really does "expand" variables in the sense I mentioned before, here is an example:
#!/bin/bash
LS="ls"
LS_FLAGS="-al"
$LS $LS_FLAGS $HOME
This looks a little enigmatic. What happens with the last line is that it actually executes the command
ls -al /home/elflord
(assuming that /home/elflord is your home directory). That is, the shell simply replaces the variables with their values, and then executes the command.
Using Braces to Protect Your Variables
OK. Here's a potential problem situation. Suppose you want to echo the value of the variable X, followed immediately by the letters "abc". Question: how do you do this? Let's have a try:
#!/bin/bash
X=ABC
echo "$Xabc"
This gives no output. What went wrong? The answer is that the shell thought we were asking for the variable Xabc, which is uninitialised. The way to deal with this is to put braces around X to separate it from the other characters. The following gives the desired result:
#!/bin/bash
X=ABC
echo "${X}abc"
Conditionals, if/then/elif
Sometimes, it's necessary to check for certain conditions. Does a string have 0 length? Does the file "foo" exist, and is it a symbolic link, or a real file? Firstly, we use the if command to run a test. The syntax is as follows:
if condition
then
statement1
statement2
..........
fi
Sometimes, you may wish to specify an alternate action when the condition fails. Here's how it's done:
if condition
then
statement1
statement2
..........
else
statement3
fi
Alternatively, it is possible to test for another condition if the first "if" fails. Note that any number of elifs can be added:
if condition1
then
statement1
statement2
..........
elif condition2
then
statement3
statement4
........
elif condition3
then
statement5
statement6
........
fi
The statements inside the block between if/elif and the next elif or fi are executed if the corresponding condition is true. Actually, any command can go in place of the conditions, and the block will be executed if and only if the command returns an exit status of 0 (in other words, if the command exits "successfully"). However, in the course of this document, we will only be interested in using "test" or "[ ]" to evaluate conditions.
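For instance, here is a minimal sketch using an ordinary command as the condition (grep -q is a standard flag that suppresses output and just sets the exit status):
#!/bin/bash
if grep -q root /etc/passwd # exit status 0 if a match is found
then
echo "/etc/passwd mentions root"
fi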
The Test Command and Operators
The command used in conditionals nearly all the time is the test command. Test returns true or false (more accurately, exits with 0 or non-zero status) depending respectively on whether the test is passed or failed. It works like this:
test operand1 operator operand2
For some tests, there need be only one operand (operand2). The test command is typically abbreviated in this form:
[ operand1 operator operand2 ]
To bring this discussion back down to earth, we give a few examples:
#!/bin/bash
X=3
Y=4
empty_string=""
if [ $X -lt $Y ] # is $X less than $Y ?
then
echo "\$X=${X}, which is greater than \$Y=${Y}"
fi
if [ -n "$empty_string" ]; then
echo "empty string is non_empty"
fi
if [ -e "${HOME}/.fvwmrc" ]; then # test to see if ~/.fvwmrc exists
echo "you have a .fvwmrc file"
if [ -L "${HOME}/.fvwmrc" ]; then # is it a symlink ?
echo "it's a symbolic link
elif [ -f "${HOME}/.fvwmrc" ]; then # is it a regular file ?
echo "it's a regular file"
fi
else
echo "you have no .fvwmrc file"
fi
Some pitfalls to be wary of
The test command needs to be in the form "operand1 operator operand2" or "operator operand2"; in other words, you really need the spaces, since the shell considers the first block containing no spaces to be either an operator (if it begins with a "-") or an operand (if it doesn't). So, for example, the script:
if [ 1=2 ]; then
echo "hello"
fi
gives exactly the "wrong" output (i.e. it echoes "hello", since it sees an operand but no operator).
Another potential trap comes from not protecting variables in quotes. We have already given an example as to why you must wrap anything you wish to use for a -n test in quotes. However, there are a lot of good reasons for using quotes all the time, or almost all of the time. Failing to do this when you have variables expanded inside tests can result in very weird bugs. Here's an example:
#!/bin/bash
X="-n"
Y=""
if [ $X = $Y ] ; then
echo "X=Y"
fi
This will give misleading output, since the shell expands our expression to
[ -n = ]
and the string "=" has non-zero length.
A brief summary of test operators
Here's a quick list of test operators. It's by no means comprehensive, but it's likely to be all you'll need to remember (if you need anything else, you can always check the bash manpage):
operator | produces true if... | number of operands |
-n | operand has non-zero length | 1 |
-z | operand has zero length | 1 |
-d | there exists a directory whose name is operand | 1 |
-f | there exists a file whose name is operand | 1 |
-eq | the operands are integers and they are equal | 2 |
-ne | the opposite of -eq | 2 |
= | the operands are equal (as strings) | 2 |
!= | opposite of = | 2 |
-lt | operand1 is strictly less than operand2 (both operands should be integers) | 2 |
-gt | operand1 is strictly greater than operand2 (both operands should be integers) | 2 |
-ge | operand1 is greater than or equal to operand2 (both operands should be integers) | 2 |
-le | operand1 is less than or equal to operand2 (both operands should be integers) | 2 |
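Here is a short sketch exercising a few of these operators (the variable names are made up for illustration, and /tmp is assumed to exist):
#!/bin/bash
X=5
Y=5
if [ $X -eq $Y ]; then # integer comparison
echo "X equals Y"
fi
if [ -z "$EMPTY" ]; then # an unset variable expands to the empty string
echo "EMPTY has zero length"
fi
if [ -d /tmp ]; then # directory test
echo "/tmp is a directory"
fi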
Loops
Loops are constructions that enable one to reiterate a procedure or perform the same procedure on several different items. The following kinds of loops are available in bash:
- for loops
- while loops
For loops
The syntax for the for loop is best demonstrated by example:
#!/bin/bash
for X in red green blue
do
echo $X
done
The for loop iterates over the space-separated items. Note that if some of the items have embedded spaces, you need to protect them with quotes. Here's an example:
#!/bin/bash
colour1="red"
colour2="light blue"
colour3="dark green"
for X in "$colour1" "$colour2" "$colour3"
do
echo $X
done
Can you guess what would happen if we left out the quotes in the for statement? This indicates that variable names should be protected with quotes unless you are pretty sure that they do not contain any spaces.
Globbing in for loops
The shell expands a string containing a * to all filenames that "match". A filename matches if and only if it is identical to the match string after replacing the stars * with arbitrary strings. For example, the character "*" by itself expands to a space-separated list of all files in the working directory (excluding those that start with a dot "."). So
echo *
lists all the files and directories in the current directory.
echo *.jpg
lists all the jpeg files.
echo ${HOME}/public_html/*.jpg
lists all jpeg files in your public_html directory. As it happens, this turns out to be very useful for performing operations on the files in a directory, especially used in conjunction with a for loop. For example:
#!/bin/bash
for X in *.html
do
grep -L '' "$X"
done
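As another sketch of the same pattern (the .txt files and .bak suffix are made up for illustration), this loop backs up every text file in the current directory:
#!/bin/bash
for X in *.txt
do
cp "$X" "$X.bak" # copy each file to a .bak backup
done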
While Loops
While loops iterate "while" a given condition is true. An example of this:
#!/bin/bash
X=0
while [ $X -le 20 ]
do
echo $X
X=$((X+1))
done
This raises a natural question: why doesn't bash allow the C-like for loops
for (X=1; X<10; X++)
As it happens, this is discouraged for a reason: bash is an interpreted language, and a rather slow one at that. For this reason, heavy iteration is discouraged.
Command Substitution
Command Substitution is a very handy feature of the bash shell. It enables you to take the output of a command and treat it as though it was written on the command line. For example, if you want to set the variable X to the output of a command, the way you do this is via command substitution.
There are two forms of command substitution: the $() form and backtick expansion.
The $() form works as follows:
$(commands)
expands to the output of commands. This permits nesting, so commands can themselves include $() substitutions. Backtick expansion expands
`commands`
to the output of commands. An example is given:
#!/bin/bash
files="$(ls )"
web_files=`ls public_html`
echo $files
echo $web_files
X=`expr 3 \* 2 + 4` # expr evaluates arithmetic expressions. man expr for details.
echo $X
Note that even though the output of ls contains newlines, the echoed variables appear on one line: because the variables are unquoted, the shell word-splits the expansion and each newline becomes a single space (quoting, as in echo "$files", would preserve them). Anyway, the advantage of the $() substitution method is almost self-evident: it is very easy to nest. It is supported by most of the Bourne shell variants (the POSIX shell or better is OK). However, the backtick substitution is slightly more readable, and is supported by even the most basic shells (any #!/bin/sh version is just fine).
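Since nesting was mentioned above, here is a small sketch of nested $() substitutions (the variable name is made up; which and dirname are standard utilities):
#!/bin/bash
# the innermost substitution finds the bash binary, the next takes
# its directory, and the outer one lists that directory
bash_dir_listing=$(ls $(dirname $(which bash)))
echo $bash_dir_listing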
Friday, January 26, 2007
BASH Tutorial
http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO.html
Conditional Parts of Makefiles
Example of a Conditional
libs_for_gcc = -lgnu
normal_libs =
ifeq ($(CC),gcc)
libs=$(libs_for_gcc)
else
libs=$(normal_libs)
endif
foo: $(objects)
$(CC) -o foo $(objects) $(libs)
Syntax of Conditionals
The syntax of a simple conditional with no else
is as follows:
conditional-directive
text-if-true
endif
The text-if-true may be any lines of text, to be considered as part of the makefile if the condition is true. If the condition is false, no text is used instead.
The syntax of a complex conditional is as follows:
conditional-directive
text-if-true
else
text-if-false
endif
or:
conditional-directive
text-if-one-is-true
else conditional-directive
text-if-true
else
text-if-false
endif
Conditional-directive
The conditional-directive is one of ifeq, ifneq, ifdef, and ifndef. Its possible forms are:
ifeq (arg1, arg2)
ifeq 'arg1' 'arg2'
ifeq "arg1" "arg2"
ifeq "arg1" 'arg2'
ifeq 'arg1' "arg2"
ifneq (arg1, arg2)
ifneq 'arg1' 'arg2'
ifneq "arg1" "arg2"
ifneq "arg1" 'arg2'
ifneq 'arg1' "arg2"
ifdef variable-name
ifndef variable-name
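For instance, a minimal sketch of the ifdef form (the DEBUG variable and the foo target are made up for illustration):
ifdef DEBUG
CFLAGS = -g -O0
else
CFLAGS = -O2
endif
foo: foo.c
$(CC) $(CFLAGS) -o foo foo.c
Running `make DEBUG=1' then builds foo with debugging flags; plain `make' uses the optimised flags.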
NFS Setup
> vi /etc/exports
/jannyroot *(rw,async,nohide,no_auth_nlm,no_root_squash)
> service nfs restart
[Client]
mount -t nfs 192.168.0.2:/jannyroot /home/nfs/public
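Before mounting, you can verify that the export is visible from the client (a quick check, assuming the server address 192.168.0.2 used above):
> showmount -e 192.168.0.2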
SELinux Setup
How to turn off SELinux:
> cd /etc/selinux
> vi config
Change SELINUX=enforcing
to SELINUX=disabled
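Editing the config file only takes effect after a reboot. To inspect and change the mode at runtime (a sketch using the standard tools; note that setenforce 0 switches to permissive mode rather than disabling SELinux outright):
> getenforce
> setenforce 0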
Makefile Pattern Rule Examples
Here are some examples of pattern rules actually predefined in make. First, the rule that compiles `.c' files into `.o' files:
%.o : %.c
$(CC) -c $(CFLAGS) $(CPPFLAGS) $< -o $@
defines a rule that can make any file x.o from x.c. The command uses the automatic variables `$@' and `$<' to substitute the names of the target file and the source file in each case where the rule applies (see Automatic Variables).
Here is a second built-in rule:
% :: RCS/%,v
$(CO) $(COFLAGS) $<
defines a rule that can make any file x whatsoever from a corresponding file x,v in the subdirectory RCS. Since the target is `%', this rule will apply to any file whatever, provided the appropriate prerequisite file exists. The double colon makes the rule terminal, which means that its prerequisite may not be an intermediate file (see Match-Anything Pattern Rules).
This pattern rule has two targets:
%.tab.c %.tab.h: %.y
bison -d $<
This tells make that the command `bison -d x.y' will make both x.tab.c and x.tab.h. If the file foo depends on the files parse.tab.o and scan.o and the file scan.o depends on the file parse.tab.h, when parse.y is changed, the command `bison -d parse.y' will be executed only once, and the prerequisites of both parse.tab.o and scan.o will be satisfied. (Presumably the file parse.tab.o will be recompiled from parse.tab.c and the file scan.o from scan.c, while foo is linked from parse.tab.o, scan.o, and its other prerequisites, and it will execute happily ever after.)
Makefile Implicit Rules
- Compiling C programs
- n.o is made automatically from n.c with a command of the form `$(CC) -c $(CPPFLAGS) $(CFLAGS)'.
- Compiling C++ programs
- n.o is made automatically from n.cc, n.cpp, or n.C with a command of the form `$(CXX) -c $(CPPFLAGS) $(CXXFLAGS)'. We encourage you to use the suffix `.cc' for C++ source files instead of `.C'.
- Compiling Pascal programs
- n.o is made automatically from n.p with the command `$(PC) -c $(PFLAGS)'.
- Assembling and preprocessing assembler programs
- n.o is made automatically from n.s by running the assembler, as. The precise command is `$(AS) $(ASFLAGS)'. n.s is made automatically from n.S by running the C preprocessor, cpp. The precise command is `$(CPP) $(CPPFLAGS)'.
- Linking a single object file
- n is made automatically from n.o by running the linker (usually called ld) via the C compiler. The precise command used is `$(CC) $(LDFLAGS) n.o $(LOADLIBES) $(LDLIBS)'. This rule does the right thing for a simple program with only one source file. It will also do the right thing if there are multiple object files (presumably coming from various other source files), one of which has a name matching that of the executable file. Thus,
x: y.o z.o
when x.c, y.c and z.c all exist will execute:
cc -c x.c -o x.o
cc -c y.c -o y.o
cc -c z.c -o z.o
cc x.o y.o z.o -o x
rm -f x.o
rm -f y.o
rm -f z.o
In more complicated cases, such as when there is no object file whose name derives from the executable file name, you must write an explicit command for linking.
Each kind of file automatically made into `.o' object files will be automatically linked by using the compiler (`$(CC)', `$(FC)' or `$(PC)'; the C compiler `$(CC)' is used to assemble `.s' files) without the `-c' option. This could be done by using the `.o' object files as intermediates, but it is faster to do the compiling and linking in one step, so that's how it's done.
Tuesday, January 16, 2007
Colour ls
[12 September 1999: The Linux Colours with Linux terminals mini-HOWTO
is not being maintained by the author any more. If you are interested
in maintaining the Colours-ls mini-HOWTO, please get in touch with
me.]
Colours with Linux terminals
Thorbjørn Ravn Andersen, ravn@dit.ou.dk
v1.4, 7 August 1997
Most Linux distributions have a 'ls' command for listing the contents
of a directory that can visually enhance their output by using
different colours, but configuring this to taste may not be a trivial
task. This document explains the various aspects and approaches of
altering the setup by configuring existing software, plus locations
of alternative software usually not included with Slackware or
RedHat, which may be used on most versions of Unix. The HTML version
is also available from my own source.
1. Introduction
In recent years colour displays have become very common, and users are
beginning to exploit this by using programs that utilize colours to
give quick visual feedback on e.g. reserved keywords in programming
languages, or instant notification of misspelled words.
As the Linux text console supports colour, the original GNU ls was
quickly modified to output colour information and included in
Slackware around version 2.0. Improved versions of these patches have
now migrated into the standard GNU distribution of ls, and should
therefore be a part of all new Linux distributions by now.
This revision is an update on a major rewrite from the initial
release, including information on xterms and kernel patching.
The information in this document has been confirmed on Redhat 4.1, and
was originally compiled with the 2.0.2 release of Slackware, and the
1.1.54 kernel. The kernel patch information was retrieved on
slackware 2.2.0 with the 1.2.13 kernel, and tcsh as the default shell,
and later confirmed with a 2.0.27 kernel. If you use any other
configuration, or unix version, I would appreciate a note stating your
operating system and version, and whether colour support is available
as standard.
2. Quickstart for the impatient
If you have a new distribution of Linux, do these modifications to
these files in your home directory. They take effect after next
login.
~/.bashrc:
alias ls="ls --color"
~/.cshrc:
alias ls 'ls --color'
That's it!
You may also want to do an ``eval `dircolors $HOME/.colourrc`'', to
get your own colours. This file is created with ``dircolors -p
>$HOME/.colourrc'' and is well commented for further editing.
3. Do I have it at all?
First of all you need to know if you have a version of ls which knows
how to colourize properly. Try this command in a Linux text console
(although an xterm will do):
% ls --color
(the % is a shell prompt):
If you get an error message indicating that ls does not understand the
option, you need to install a new version of the GNU fileutils
package. If you do not have an appropriate upgrade package for your
distribution, just get the latest version from your GNU mirror and
install directly from source.
If you do not get an error message, you have a ls which understands
the command. Unfortunately, some of the earlier versions included
previously with Slackware (and possible others) were buggy. The ls
included with Redhat 4.1 is version 3.13 which is okay.
% ls --version
ls - GNU fileutils-3.13
If you ran the ``ls --color'' command on a Linux text-based console,
the output should have been colourized according to the defaults on
the system, and you can now decide whether there is anything you want
to change.
If you ran it in an xterm, you may or you may not have seen any colour
changes. As with ls itself, the original xterm-program did not have
any support of colour for the programs running inside of it, but
recent versions do. If your xterm doesn't support colours, you should
get a new version as described at the end of this document. In the
meantime just switch to textmode and continue from there.
4. Which colours are there to choose from?
This shell script (thanks to the many who sent me bash versions) shows
all standard colour combinations on the current console. If no
colours appear, your console does not support ANSI colour selections.
#!/bin/bash
# Display ANSI colours.
#
esc="\033["
echo -n " _ _ _ _ _40 _ _ _ 41_ _ _ _42 _ _ _ 43"
echo "_ _ _ 44_ _ _ _45 _ _ _ 46_ _ _ _47 _"
for fore in 30 31 32 33 34 35 36 37; do
line1="$fore "
line2=" "
for back in 40 41 42 43 44 45 46 47; do
line1="${line1}${esc}${back};${fore}m Normal ${esc}0m"
line2="${line2}${esc}${back};${fore};1m Bold ${esc}0m"
done
echo -e "$line1\n$line2"
done
The foreground colour number is listed to the left, and the background
number in the box. If you want bold characters you add a "1" to the
parameters, so bold white on blue would be "37;44;1". The whole
ANSI selection sequence is then
ESC [ 3 7 ; 4 4 ; 1 m
Note: The background currently cannot be bold, so you cannot have
yellow (bold brown) as anything but foreground. This is a hardware
limitation.
The colours are:
0 - black 4 - blue 3# is foreground
1 - red 5 - magenta 4# is background
2 - green 6 - cyan
3 - yellow 7 - white ;1 is bold
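To try a combination directly, here is a one-line sketch using the codes above (echo -e interprets the \033 escape, as in the script earlier; the final sequence resets the attributes):
echo -e "\033[37;44;1m bold white on blue \033[0m"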
5. How to configure colours with ls
If you wish to modify the standard colour set built into ls, you need
your personal copy in your home directory, which you get with
cd ; dircolors -p > .colourrc
After modifying this well-commented file you need to have it read into
the environment string LS_COLORS, which is usually done with
eval `dircolors .colourrc`
You need to put this line in your .bashrc/.cshrc/.tcshrc (depending on
your shell), to have it done at each login. See the dircolors(1)
manual page for details.
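Putting the pieces together, the relevant lines in ~/.bashrc might look like this (a sketch, assuming the palette was saved as ~/.colourrc as in the quickstart):
eval `dircolors $HOME/.colourrc`
alias ls="ls --color"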
6. How to change the text-mode default from white-on-black
You will need to tell the terminal driver code that you want another
default. There exists no standard way of doing this, but in case of
Linux you have the setterm program.
"setterm" uses the information in the terminal database to set the
attributes. Selections are done like
setterm -foreground black -background white -store
where the "-store" besides the actual change makes it the default for
the current console as well. This requires that the current terminal
(TERM environment variable) is described "well enough" in the termcap
database. If setterm for some reason does not work, here are some
alternatives:
6.1. Xterm
One of these xterms should be available and at least one of them
supports colour.
xterm -fg white -bg blue4
color_xterm -fg white -bg blue4
color-xterm -fg white -bg blue4
nxterm -fg white -bg blue4
where 'color_xterm' supports the colour version of 'ls'. This
particular choice resembles the colours used on an SGI.
6.2. Virtual console.
You may modify the kernel once and for all, as well as providing a
run-time default for the virtual consoles with an escape sequence. I
recommend the kernel patch if you have compiled your own kernel.
The kernel source file is /usr/src/linux/drivers/char/console.c around
line 1940, where you should modify
def_color = 0x07; /* white */
ulcolor = 0x0f; /* bold white */
halfcolor = 0x08; /* grey */
as appropriate. I use white on blue with
def_color = 0x17; /* white */
ulcolor = 0x1f; /* bold white */
halfcolor = 0x18; /* grey */
The numbers are the attribute codes used by the video card in
hexadecimal: the most significant digit (the "1" in the example
colours above) is the background; the least significant the
foreground. 0 = black, 1 = blue, 2 = green, 3 = cyan, 4 = red, 5 =
purple, 6 = brown/yellow, 7 = white. Add 8 to get "bright" colours.
Note that, in most cases, a bright background == blinking characters
on a dull background. (From sjlam1@mda023.cc.monash.edu.au)
You may also supply a new run-time default for a virtual console, on a
per-display basis with the non-standard ANSI sequence (found by
browsing the kernel sources)
ESC [ 8 ]
which sets the default to the current fore- and background colours.
Then the Reset Attributes string (ESC [ m) selects these colours
instead of white on black.
You will need to actually echo this string to the console each time
you reboot. Depending on what you use your Linux box for, several
places may be appropriate:
6.2.1. /etc/issue
This is where "Welcome to Linux xx.yy" is displayed under Slackware,
and that is a good choice for stand-alone equipment (and probably be a
pestilence for users logging in with telnet). This file is created at
boottime (Slackware in /etc/rc.d/rc.S; Redhat in /etc/rc.d/rc.local),
and you should modify lines looking somewhat like
echo ""> /etc/issue
echo Welcome to Linux `/bin/uname -a | /bin/cut -d\ -f3`. >> /etc/issue
to
ESCAPE="
echo "${ESCAPE}[H${ESCAPE}[37;44m${ESCAPE}[8]${ESCAPE}[2J"> /etc/issue
echo Welcome to Linux `/bin/uname -a | /bin/cut -d\ -f3`. >> /etc/issue
This code will home the cursor, set the colour (here white on blue),
save this selection and clean the rest of the screen. The
modification takes effect after the next reboot. Remember to insert
the _literal_ escape character in the file with C-q in emacs or
control-v in vi, as apparently the sh used for executing this script
does not understand the \033 syntax.
6.2.2. /etc/profile or .profile
if [ "$TERM" = "console" ]; then
echo "\033[37;44m\033[8]" #
# or use setterm.
setterm -foreground white -background blue -store
fi
6.2.3. /etc/login or .login
if ( "$TERM" == "console" ) then
echo "\033[37;44m\033[8]"
# or use setterm.
setterm -foreground white -background blue -store
endif
6.3. Remote login
You should be able to use the setterm program as shown above. Again,
this requires that the remote machine knows enough about your
terminal, and that the terminal emulator providing the login supports
colour. In my experience the best vt100 emulations currently available
for other platforms are:
· MS-DOS: MS-Kermit (free, not a Microsoft product)
· Windows 95/NT: Kermit/95 (shareware)
· OS/2: Kermit/95 (shareware). Note though that the
standard telnet understands colours and can be customized locally.
7. Software
All the information described here is assuming a GNU/Linux
installation. If you have something else (like e.g. a Sun running X
or so) you can get and compile the actual software yourself.
The colour version of 'xterm' is based on the standard xterm source
with a patch available from any X11R6 site. The xterm distributed
with R6.3 is rumoured to have native colour support, but is untested
by me.
ftp://ftp.denet.dk/pub/X11/contrib/utilities/color-xterm-R6pl5-patch.gz
See the documentation if you use an older version of X. Note: I
haven't tried this myself!
The GNU fileutils package is available from the GNU archive or one
of the several mirrors. Get at least version 3.13.
ftp://ftp.denet.dk/pub/gnu/fileutils-3.XX.tar.gz
I have myself successfully compiled color-ls on Solaris, SunOS and
Irix.
I would appreciate feedback on this text. My e-mail address is
ravn@dit.ou.dk
--
Thorbjørn Ravn Andersen
Friday, January 05, 2007
A Simple Makefile Tutorial
Makefiles are a simple way to organize code compilation. This tutorial does not even scratch the surface of what is possible using make, but is intended as a starters guide so that you can quickly and easily create your own makefiles for small to medium-sized projects.
A Simple Example
Let's start off with the following three files, hellomake.c, hellofunc.c, and hellomake.h, which would represent a typical main program, some functional code in a separate file, and an include file, respectively.
hellomake.c:
#include <hellomake.h>
int main() {
// call a function in another file
myPrintHelloMake();
return(0);
}
hellofunc.c:
#include <stdio.h>
#include <hellomake.h>
void myPrintHelloMake(void) {
printf("Hello makefiles!\n");
return;
}
hellomake.h:
/*
example include file
*/
void myPrintHelloMake(void);
Normally, you would compile this collection of code by executing the following command:
gcc -o hellomake hellomake.c hellofunc.c -I.
This compiles the two .c files and names the executable hellomake. The -I. is included so that gcc will look in the current directory (.) for the include file hellomake.h. Without a makefile, the typical approach to the test/modify/debug cycle is to use the up arrow in a terminal to go back to your last compile command so you don't have to type it each time, especially once you've added a few more .c files to the mix.
Unfortunately, this approach to compilation has two downfalls. First, if you lose the compile command or switch computers you have to retype it from scratch, which is inefficient at best. Second, if you are only making changes to one .c file, recompiling all of them every time is also time-consuming and inefficient. So, it's time to see what we can do with a makefile.
The simplest makefile you could create would look something like:
Makefile 1
hellomake: hellomake.c hellofunc.c
gcc -o hellomake hellomake.c hellofunc.c -I.
If you put this rule into a file called Makefile or makefile and then type make on the command line it will execute the compile command as you have written it in the makefile. Note that make with no arguments executes the first rule in the file. Furthermore, by putting the list of files on which the command depends on the first line after the :, make knows that the rule hellomake needs to be executed if any of those files change. Immediately, you have solved problem #1 and can avoid using the up arrow repeatedly, looking for your last compile command. However, the system is still not being efficient in terms of compiling only the latest changes.
One very important thing to note is that there is a tab before the gcc command in the makefile. There must be a tab at the beginning of any command, and make will not be happy if it's not there.
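With Makefile 1 in place, a run looks like this (% is the shell prompt; make prints each command before executing it):
% make
gcc -o hellomake hellomake.c hellofunc.c -I.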
In order to be a bit more efficient, let's try the following:
Makefile 2
CC=gcc
CFLAGS=-I.
hellomake: hellomake.o hellofunc.o
$(CC) -o hellomake hellomake.o hellofunc.o -I.
So now we've defined some constants CC and CFLAGS. It turns out these are special constants that communicate to make how we want to compile the files hellomake.c and hellofunc.c. In particular, the macro CC is the C compiler to use, and CFLAGS is the list of flags to pass to the compilation command. By putting the object files--hellomake.o and hellofunc.o--in the dependency list and in the rule, make knows it must first compile the .c versions individually, and then build the executable hellomake.
Using this form of makefile is sufficient for most small scale projects. However, there is one thing missing: dependency on the include files. If you were to make a change to hellomake.h, for example, make would not recompile the .c files, even though they needed to be. In order to fix this, we need to tell make that all .c files depend on certain .h files. We can do this by writing a simple rule and adding it to the makefile.
Makefile 3
CC=gcc
CFLAGS=-I.
DEPS = hellomake.h
%.o: %.c $(DEPS)
$(CC) -c -o $@ $< $(CFLAGS)
hellomake: hellomake.o hellofunc.o
gcc -o hellomake hellomake.o hellofunc.o -I.
This addition first creates the macro DEPS, which is the set of .h files on which the .c files depend. Then we define a rule that applies to all files ending in the .o suffix. The rule says that the .o file depends upon the .c version of the file and the .h files included in the DEPS macro. The rule then says that to generate the .o file, make needs to compile the .c file using the compiler defined in the CC macro. The -c flag says to generate the object file, the -o $@ says to put the output of the compilation in the file named on the left side of the :, the $< is the first item in the dependencies list, and the CFLAGS macro is defined as above.
As a final simplification, let's use the special macros $@ and $^, which are the left and right sides of the :, respectively, to make the overall compilation rule more general. In the example below, all of the include files should be listed as part of the macro DEPS, and all of the object files should be listed as part of the macro OBJ.
Makefile 4
CC=gcc
CFLAGS=-I.
DEPS = hellomake.h
OBJ = hellomake.o hellofunc.o
%.o: %.c $(DEPS)
$(CC) -c -o $@ $< $(CFLAGS)
hellomake: $(OBJ)
gcc -o $@ $^ $(CFLAGS)
So what if we want to start putting our .h files in an include directory, our source code in a src directory, and some local libraries in a lib directory? Also, can we somehow hide those annoying .o files that hang around all over the place? The answer, of course, is yes. The following makefile defines paths to the include and lib directories, and places the object files in an obj subdirectory within the src directory. It also has a macro defined for any libraries you want to include, such as the math library -lm. This makefile should be located in the src directory. Note that it also includes a rule for cleaning up your source and object directories if you type make clean. The .PHONY rule keeps make from doing something with a file named clean.
Makefile 5
IDIR =../include
CC=gcc
CFLAGS=-I$(IDIR)
ODIR=obj
LDIR =../lib
LIBS=-lm
_DEPS = hellomake.h
DEPS = $(patsubst %,$(IDIR)/%,$(_DEPS))
_OBJ = hellomake.o hellofunc.o
OBJ = $(patsubst %,$(ODIR)/%,$(_OBJ))
$(ODIR)/%.o: %.c $(DEPS)
$(CC) -c -o $@ $< $(CFLAGS)
hellomake: $(OBJ)
gcc -o $@ $^ $(CFLAGS) $(LIBS)
.PHONY: clean
clean:
rm -f $(ODIR)/*.o *~ core $(IDIR)/*~
So now you have a perfectly good makefile that you can modify to manage small and medium-sized software projects. You can add multiple rules to a makefile; you can even create rules that call other rules. For more information on makefiles and the make function, check out the GNU Make Manual, which will tell you more than you ever wanted to know (really).