From gcc to the autotools

  • From GNU/GCC to the autotools Thierry GAYET
  • PLAN
      • Development environment
      • Build process on GNU/Linux
      • Tools needed for development
      • Static and dynamic libraries
      • Manual build
      • Building with GNU/make
      • Building with the autotools:
        • Automatic build using the autotools
        • Using the autotools
        • Pragmatic example in action
      • GNU/make/autotools vs tmake
      • Conclusion
  • 1. Development environment
  • Workstation 06/17/10
    • On a workstation, everything is on the local rootfs:
      • The binutils: ar, ldd, strip, …
      • The toolchain: gcc, system libraries (libc, libm, libpthread, …), headers, …
      • The debugger: gdb
    • Only one architecture is in use at a time (x86, x86-64, …), so we can only compile for that one.
    http://www.pathname.com/fhs/
  • Cross-compilation for an embedded target
    • The SDK (stagingdir) is a temporary rootfs dedicated to the target architecture. If several architectures are used, several SDKs are needed too.
    • Toolchain:
      • gcc
      • headers
      • host tools
    [Diagram: the local rootfs and version control feed an intermediate installation used for compilation (headers) and linking (libraries); the real rootfs for the embedded target receives the final installation and reaches the real embedded target through firmware flashing or NFS boot.]
  • 2. Build process on GNU/Linux
  • Summary
    [Diagram of the build pipeline: source code (.c/.cpp) and headers (.h) → preprocessing (gcc -E) → compiling (gcc -c) → assembling (as) → ELF objects (.o) → linking (ld) → output: ELF binary, static library (.a) or dynamic library (.so).]
  • C preprocessing
      • A lexical preprocessor;
      • Operates on the source text;
      • Runs prior to any parsing, performing simple substitution of tokenized character sequences;
      • Typically performs macro substitution, textual inclusion of other files, and conditional compilation or inclusion;
      • Takes lines beginning with '#' as directives.
    Example of running the C preprocessor on a C file: $ gcc -E file.c (the preprocessed source is written to stdout unless -o is given)
  • Assembler
    If the -S parameter is given, gcc stops after the stage of compilation proper and does not assemble. The output is an assembler code file for each non-assembler input file specified. By default, the assembler file name for a source file is made by replacing the suffix .c, .i, etc., with .s. Input files that don't require compilation are ignored. Example for getting the assembly code of a .c source file: $ gcc -S file.c http://homepage.fudan.edu.cn/~euler/gcc_asm/
  • Compiling: the frontend of the compiler
      • Parses the source code with a lexical/semantic analyzer (such as gnu/flex);
      • Builds an internal representation of the program;
      • In a first step, generates non-optimized intermediate code;
      • In a second step, generates optimized intermediate code (if required);
      • The intermediate code is adapted to the target architecture.
    Example of generating an intermediate object: $ gcc -c file.c -o file.o $ file file.o file.o: ELF 32-bit LSB relocatable, Intel 80386, version 1 (SYSV), not stripped (GCC, the GNU Compiler Collection, has frontends for C, Java, Fortran, Pascal, assembly, ...)
  • Linking: the backend of the compiler
      • Generates the final output from the intermediate objects;
      • Generates ELF objects (which replaced the old a.out format);
      • Uses ld through gcc;
      • Can make:
        • a binary: adds a main entry to the intermediate code
        • a dynamic library: if the -shared parameter has been given (not runnable, except for the libc)
    • Direct usage of the ld linker: $ ld -o mybinary /lib/crt0.o file.o -lc
    • crt0 (or crt0.o, gcrt0.o, mcrt0.o) is a set of execution startup routines (usually part of the C standard library) that are platform-dependent and required in order to compile using GCC and other GNU tools. crt stands for "C runtime".
    • -e entry: use entry as the explicit symbol for beginning execution of your program, rather than the default entry point (_start). If there is no symbol named entry, the linker will try to parse entry as a number and use that as the entry address (interpreted in base 10; use a leading 0x for base 16, or a leading 0 for base 8).
  • 3. Tools needed for development
  • The toolchain, the heart of the development process
    A toolchain is the set of programming tools used to create a product (typically another computer program or system of programs). The tools may be used in a chain, so that the output of each tool becomes the input for the next, but the term is used widely to refer to any set of linked development tools.
    A simple software development toolchain consists of a text editor for editing source code, a compiler and linker to transform the source code into an executable program, libraries to provide interfaces to the operating system, and a debugger. A complex product such as a video game needs tools for preparing sound effects, music, textures, 3-dimensional models, and animations, and further tools for combining these resources into the finished product.
    On a workstation, the toolchain is made up of all the tools used for compilation (gcc, gdb, ...). It includes all the headers and native libraries that can be used for compilation and linking. Besides that, a toolchain is often used for cross-compiling code for another target. In order to select the right one, a prefix is used (sh4-linux-gcc, …); the binutils are included as well.
  • The binutils
    The GNU Binutils are a collection of binary tools. The main ones are:
    • ld - the GNU linker.
    • as - the GNU assembler.
    But they also include:
    • addr2line - Converts addresses into filenames and line numbers.
    • ar - A utility for creating, modifying and extracting from archives.
    • gprof - Displays profiling information.
    • nm - Lists symbols from object files.
    • objcopy - Copies and translates object files.
    • objdump - Displays information from object files.
    • ranlib - Generates an index to the contents of an archive.
    • readelf - Displays information from any ELF format object file.
    • size - Lists the section sizes of an object or archive file.
    • strings - Lists printable strings from files.
    • strip - Discards symbols.
    For cross-compilation, a prefix is added before the name of each binutil (sh4-linux-ar, sh4-linux-nm, sh4-linux-strip, ...).
  • 4. Static and dynamic libraries
  • Summary of the differences
    • Static libraries:
      • All symbols are integrated into the one binary
      • The nm command displays only the built-in (internal) functions
    • Dynamic libraries:
      • The binary uses external symbols provided by dynamic libraries
      • The nm command shows internal (T) and external (U) symbols;
      • The ldd command displays the binary's library dependencies.
  • Static libraries: creation and extraction
    • Creating a static library is an easy task thanks to the ar tool from the binutils:
    • $ ar -rv mystaticlibrary.a file1.o file2.o file3.o
    • or
    • $ ar -rv mystaticlibrary.a *.o
    • The previous command merges the three objects (file1.o, file2.o and file3.o) into one archive.
    • A static library:
      • is a container for ELF objects
      • is similar to an archive such as zip or tar
      • has a .a extension
    • A static library is not linked; it is an archive made by the ar binutil, so it is possible to extract the built-in objects from it.
    • Extracting the content of a static library uses the same tool: $ ar x mystaticlibrary.a
    • Some tools such as Midnight Commander (mc) can browse inside it.
    • For more information: man ar
  • Static libraries: tests
    It is also possible to list the object files included in the library: $ ar -t mystaticlibrary.a cominter.o com_util.o filointer.o miscell.o paral.o pilot.o simul_api.o spyinter.o userint.o
    The following command lists the symbols per object: $ nm mystaticlibrary.a ad_server.o: 00000099 T affichage_etat_client 00000004 C bDebugAd 00000004 C bDebugSu U bTrace 00000390 T close_socket cominter.o: 00000004 d bComInit 00000000 d bNoInitWarn 00000010 b bTrComInter
    U: undefined (external implementation). T: implemented inside. "For an object this lists the functions; for a library, the functions per object."
  • Static libraries: linking
    • A link with a static library brings all the functions used within the source code into the final binary.
    • The final symbol table is a merge of the functions used inside the source code and the functions from the static library:
      • $ gcc test.o mystaticlibrary.a -o test
    • Because a static library is an archive that contains ELF objects, a link with a static library is similar to a link with other ELF objects:
    • $ gcc file.o mystaticlibrary.a -o test
    • or
      • $ gcc file.o file1.o file2.o -o test
    • with mystaticlibrary.a containing the two files file1.o and file2.o. Static libraries are often used because only the symbols actually in use are included in the final binary.
  • Dynamic libraries: creation
    Creating a dynamic library requires linking all the objects into one ELF object. We don't need any other binutil, just the ld linker itself, used through gcc:
    $ gcc -Wall -fPIC -c test1.c -o test1.o
    $ gcc -Wall -fPIC -c test2.c -o test2.o
    $ gcc -shared -Wl,-soname,libtest.so.1 -o libtest.so.1.0 test1.o test2.o
    These libraries have a .so extension and are associated with a version (major and minor). For more information: man ld
    Compiler options:
    -Wall: include all warnings. See the man page for the warnings enabled.
    -fPIC: output position-independent code, a characteristic required by shared libraries. Also see -fpic.
    -shared: produce a shared object which can then be linked with other objects to form an executable.
    -Wl: pass options to the linker. In this example the option passed on to the linker is -soname libtest.so.1.
    -o: output of the operation. In this case the name of the shared object to be output is libtest.so.1.0.
  • Dynamic libraries: dependencies and runtime
    Once a binary is built, it is possible to get a listing of its dynamic dependencies:
    $ ldd mybinary libanasm7.so => /home/tgayet/vittam2/lib.i386_linux/libanasm7.so (0x40017000) libc.so.6 => /lib/tls/libc.so.6 (0x42000000) /lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
    This command shows which libraries the binary is linked against, along with the path of each library. The nm tool can also be used to get the symbols (functions built inside); the library should not be stripped.
    At runtime, a binary linked with a dynamic library checks the local cache maintained by the ldconfig tool. Previously the configuration was made in /etc/ld.so.conf, but it has been replaced by the /etc/ld.so.conf.d/ directory.
    Regenerating the cache: $ sudo ldconfig
    Dumping the cache: $ sudo ldconfig -p
    A dynamic library is linked with a binary only at runtime, so we can override the lookup path through the LD_LIBRARY_PATH environment variable:
    $ export LD_LIBRARY_PATH=./my_lib_path
    $ ./mybinary
  • Dynamic libraries: linking
    In order to link a dynamic library with a binary, we can pass some information to the ld linker:
    $ gcc test.o -L./libpath -ltest -o test
    This command says that the test.o object making up the final binary will be dynamically linked with the libtest.so library. The "lib" prefix must not be specified because it is automatically added (for libtest.so, only test is given to the -l parameter).
    The -l<libname> parameter asks the linker to search for a library called lib<libname>. By default the dynamic one (lib<libname>.so) is used, unless the -static parameter is provided.
    Linker options:
    -l: provide a library name (e.g. for libtest, only -ltest is specified)
    -L: provide a path (or a set of paths) for the libraries
  • Dynamic libraries: pmap / exmap
    $ pmap -x 22388
    22388: bash
    Address Kbytes RSS Anon Locked Mode Mapping
    08048000 780 - - - r-x-- bash
    0810b000 4 - - - r---- bash
    0810c000 20 - - - rw--- bash
    0021f000 1356 - - - r-x-- libc-2.11.1.so
    00372000 4 - - - ----- libc-2.11.1.so
    00373000 8 - - - r---- libc-2.11.1.so
    00375000 4 - - - rw--- libc-2.11.1.so
    009a9000 108 - - - r-x-- ld-2.11.1.so
    009c4000 4 - - - r---- ld-2.11.1.so
    009c5000 4 - - - rw--- ld-2.11.1.so
    00cc4000 24 - - - r-x-- libnss_compat-2.11.1.so
    00cca000 4 - - - r---- libnss_compat-2.11.1.so
    00ccb000 4 - - - rw--- libnss_compat-2.11.1.so
    00e17000 8 - - - r-x-- libdl-2.11.1.so
    00e19000 4 - - - r---- libdl-2.11.1.so
    00e1a000 4 - - - rw--- libdl-2.11.1.so
    00f99000 76 - - - r-x-- libnsl-2.11.1.so
    00fac000 4 - - - r---- libnsl-2.11.1.so
    00fad000 4 - - - rw--- libnsl-2.11.1.so
    bffbe000 84 - - - rw--- [ stack ]
    -------- ------- ------- ------- -------
    total kB 9748 - - -
    • Example of the memory-map dump for the bash binary.
    • Dynamic libraries are loaded in memory only once, even when used by several processes.
    • These tools are very useful for knowing how much memory is used by a binary.
    • Another option is the exmap command.
  • 5. Manual build
  • Manual build
    • Compiling:
      • Obtaining an intermediate object: $ gcc -c my_file.c -o my_file.o (the -c flag should be used only for individual compilation)
    • Linking:
      • Standalone binary (ELF object): $ gcc my_file.o -o my_bin
      • Binary with dynamic link: $ gcc my_file.o -L<LIB_PATH> -lmy_lib -o my_bin
      • Binary with static link:
      • $ gcc -static my_file.o -L<LIB_PATH> -lmy_lib -o my_bin
      • $ gcc my_file.o my_lib.a -o my_bin
  • Debug mode
    gcc has a set of parameters that can be used for customizing the debug mode. For more information: man gcc
    -g: produce debugging information in the operating system's native format (stabs, COFF, XCOFF, or DWARF 2). GDB can work with this debugging information.
    -ggdb: produce debugging information for use by GDB. This means using the most expressive format available (DWARF 2, stabs, or the native format if neither of those is supported), including GDB extensions if at all possible.
    -Werror: all warnings become errors
    -Wall: enable all warnings
  • Optimization mode
    gcc has a set of parameters that can be used for customizing the optimization level. If multiple -O options are given, with or without level numbers, the last such option is the one that takes effect. By definition, no optimization should be set (-O0) when the debug mode is used.
    -O, -O1: optimize. Optimizing compilation takes somewhat more time, and a lot more memory for a large function.
    -O2: optimize even more. GCC performs nearly all supported optimizations that do not involve a space-speed tradeoff. The compiler does not perform loop unrolling or function inlining when you specify -O2. As compared to -O, this option increases both compilation time and the performance of the generated code.
    -O3: optimize yet more. -O3 turns on all optimizations specified by -O2 and also turns on the -finline-functions and -frename-registers options.
    -O0: do not optimize. This is the default.
    -Os: optimize for size. -Os enables all -O2 optimizations that do not typically increase code size. It also performs further optimizations designed to reduce code size. -Os disables the following optimization flags: -falign-functions -falign-jumps -falign-loops -falign-labels -freorder-blocks -fprefetch-loop-arrays
  • Conditional compilation
    In order to customize the source code that will be compiled, we can provide the C preprocessor with compilation flags (CFLAGS) that enable or disable parts of the code. Example with a part of code to include:
    #ifdef CONDITION printf("If you see this message, it means that this code is enabled."); #endif
    To enable this part of the code, we use the following command:
    $ gcc -c -o test.o -DCONDITION test.c
    or
    $ gcc -c -o test.o -DCONDITION=1 test.c
    Other C preprocessor directives: #ifdef: if defined; #ifndef: if not defined; #if: if; #elif: else if; #endif: end of an if or ifdef condition.
    Examples of usage:
    #if (__cplusplus==199711L) … #endif
    #ifndef OnceTime #define OnceTime (to include the file only one time) #endif
  • Build coverage with gnu/gcov
    • gcov is a test coverage program. Use it in concert with GCC to analyze your programs to help create more efficient, faster running code and to discover untested parts of your program. You can use gcov as a profiling tool to help discover where your optimization efforts will best affect your code. You can also use gcov along with the other profiling tool, gprof, to assess which parts of your code use the greatest amount of computing time.
    • Profiling tools help you analyze your code's performance. Using a profiler such as gcov or gprof, you can find out some basic performance statistics, such as:
      • how often each line of code executes
      • what lines of code are actually executed
      • how much computing time each section of code uses
    • $ gcc -fprofile-arcs -ftest-coverage -g tableau.c -o tableau
    • $ gcov tableau.c
    • 92.31% of 26 source lines executed in file tableau.c
    • Creating tableau.c.gcov.
    • $ more tableau.c.gcov
    • void permuter_cases (char *tableau, int i1, int i2, int taille)
    • 20 {
    • 20 char inter;
    • 20 if( (i1 < 0) || (i2 < 0) || (i1 >= taille) || (i2 >= taille))
  • Built-in profiling with gnu/gprof 1/3
    The first step in generating profile information for your program is to compile and link it with profiling enabled. To compile a source file for profiling, specify the -pg option when you run the compiler (this is in addition to the options you normally use). To link the program for profiling, if you use a compiler such as cc to do the linking, simply specify -pg in addition to your usual options. The same option, -pg, alters either compilation or linking to do what is necessary for profiling. Here are examples:
    $ gcc -g -c myprog.c utils.c -pg (compilation)
    $ gcc -o myprog myprog.o utils.o -pg (link)
    The -pg option also works with a command that both compiles and links:
    $ gcc -o myprog myprog.c utils.c -g -pg
    Note: the -pg option must be part of your compilation options as well as your link options. If it is not, then no call-graph data will be gathered, and when you run gprof you will get an error message like this:
    gprof: gmon.out file is missing call-graph data
    If you add the -Q switch to suppress the printing of the call graph data, you will still be able to see the time samples:
    Flat profile: Each sample counts as 0.01 seconds. % cumulative self self total time seconds seconds calls ms/call ms/call name 33.34 0.02 0.02 7208 0.00 0.00 open 16.67 0.03 0.01 244 0.04 0.12 offtime 16.67 0.04 0.01 8 1.25 1.25 memccpy 16.67 0.05 0.01 7 1.43 1.43 write
  • Profiling with gnu/gprof 2/3
    • Gprof profiling is similar in some ways to prof profiling. Instead of prof's option -p, the usual option to enable gprof profiling is -pg. The linker links against a different mcount() function which maintains exact counts of entries into each function by individual call sites, probably by walking the stack at run-time to find the address the called function will return to.
    • The gprof post-processor then constructs the call graph for the program, and propagates function execution time (obtained from the PC sampling) through the call graph, proportionally to the number of calls from each call site for the function. The resulting weighted call graph gives a more thorough picture of inefficiencies in the program; however the call graph may be substantially inaccurate when:
    • Propagating execution time meaningfully is difficult when there is recursion (i.e., the call graph is not a tree).
    • The heuristic of allocating execution time of a function to its call sites proportionally to the number of calls from each call site fails when different call sites make substantially different demands on the function. E.g., a function might be called an equal number of times from locations A and B, but the average latency for calls from A might be 100 times longer than for calls from B; nevertheless, gprof would assign equal amounts of time to be propagated up the call graph to locations A and B.
    Gprof with pthreads requires some adaptation of the code.
  • Profiling with gnu/gprof 3/3
    The remaining functions in the listing (those whose self seconds field is 0.00) didn't appear in the histogram samples at all. However, the call graph indicated that they were called, so they are listed, sorted in decreasing order by the calls field. Clearly some time was spent executing these functions, but the paucity of histogram samples prevents any determination of how much time each took. Here is what the fields in each line mean:
    % time: the percentage of the total execution time your program spent in this function. These should all add up to 100%.
    cumulative seconds: the cumulative total number of seconds the computer spent executing this function, plus the time spent in all the functions above this one in this table.
    self seconds: the number of seconds accounted for by this function alone. The flat profile listing is sorted first by this number.
    calls: the total number of times the function was called. If the function was never called, or the number of times it was called cannot be determined (probably because the function was not compiled with profiling enabled), the calls field is blank.
    self ms/call: the average number of milliseconds spent in this function per call, if this function is profiled. Otherwise, this field is blank.
    total ms/call: the average number of milliseconds spent in this function and its descendants per call, if this function is profiled. Otherwise, this field is blank. This is the only field in the flat profile that uses call graph analysis.
    name: the name of the function. The flat profile is sorted by this field alphabetically after the self seconds and calls fields are sorted.
  • 6. Building with GNU/make
  • Automatic build using gnu/make
    The make program gets its dependency "graph" from a text file called makefile or Makefile which resides in the same directory as the source files. Make checks the modification times of the files, and whenever a file becomes "newer" than something that depends on it (in other words, modified), it runs the compiler accordingly. Project1 is a target and the name of the final binary. Its dependencies are the three objects data.o, main.o and io.o. Each one in turn has its own dependencies, which make will resolve one by one. If you edit io.c, it becomes "newer" than io.o, meaning that make must run cc -c io.c to create a new io.o, then run cc data.o main.o io.o -o project1 to rebuild project1.
  • Automatic build using gnu/make
    Each dependency shown in the graph is circled with a corresponding color in the Makefile, and each uses the following format:
    target: source file(s)
    <TAB> command
    A target given in the Makefile is a file which will be created or updated when any of its source files are modified. The command(s) given in the subsequent line(s), which must be preceded by a tab character, are executed in order to create the target file. For more information: http://www.gnu.org/doc/doc.html
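The project1 example above, written out as a complete Makefile (file and header names are hypothetical):

```make
# Final binary depends on the three objects
project1: data.o main.o io.o
	cc data.o main.o io.o -o project1

# Each object depends on its source (and any shared header)
data.o: data.c data.h
	cc -c data.c

main.o: main.c data.h io.h
	cc -c main.c

io.o: io.c io.h
	cc -c io.c

clean:
	rm -f project1 data.o main.o io.o
```

Editing io.c makes io.o out of date, so make reruns only the io.o rule and then the final link.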
  • Automatic build using gnu/make
    • By default make looks for a file named "Makefile". We can specify another one through the -f parameter: $ make -f myMakefile (will use myMakefile instead)
    • $ make -C /myProject (will use the makefile present in the /myProject directory)
    • If no Makefile is found, the make program displays: make: *** No targets specified and no makefile found. Stop.
    • Make is suited both to native compilation and cross-compilation. Adding a specific rule for using a toolchain is very easy:
    • CC=$(PREFIX)-gcc
    • .c.o:
    • 	$(CC) -c $(CFLAGS) -o $@ $<
    • Settings can come from environment variables or parameters:
      • export TOOLCHAIN_BIN=<PATH_TOOLCHAIN>/bin
      • export PREFIX=sh4-linux
    • or
      • make ARCH=sh4-linux
    • then launch the "make" command
  • Automatic build using gnu/make
    In addition to the macros you can create yourself, a few macros are used internally by the make program. Some of them are listed below:
    CC: contains the current C compiler. Defaults to gcc.
    CFLAGS: special options which are added to the built-in C rule.
    LDFLAGS: special options which are added to the link.
    $@: full name of the current target.
    $?: a list of files for the current dependency which are out of date.
    $<: the source file of the current (single) dependency.
    You can also manipulate the way these macros are evaluated. Assuming that OBJS = data.o io.o main.o, using $(OBJS:.o=.c) within the Makefile substitutes the .o at the end with .c, giving you the following result: data.c io.c main.c
    For debugging a Makefile: $ make -d
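A small fragment showing those internal macros and the substitution reference together (file names are hypothetical):

```make
CC     = gcc
CFLAGS = -Wall -O2
OBJS   = data.o io.o main.o
SRCS   = $(OBJS:.o=.c)        # expands to: data.c io.c main.c

project1: $(OBJS)
	$(CC) $(LDFLAGS) $(OBJS) -o $@    # $@ expands to project1

# Pattern rule: $< is the .c source of the object being built
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```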
  • Using gnu/make
    • Compilation and link:
      • Use "make" from the command line (it will call the default target, such as "all")
      • It will look for the current Makefile (or another file given with -f)
      • It doesn't include many default rules, so almost everything must be implemented manually
    • Installation:
      • Use "make install" from the command line
      • Not standard … must be written on your own
      • The install/cp command can be used
    • Cleaning:
      • Use "make clean" from the command line
      • Removes temporary and intermediate objects generated during the build process
    • Archive/delivery:
      • Use "make delivery" from the command line
      • Not standard … must also be written on your own
      • The tar.gz/tar.bz2/.7z/zip/... compression formats can be used
  • 7. Building with the autotools
  • Automatic build using the autotools
    • The autotools are a set of scripts (automake, autoconf, aclocal, autoheader, libtool, …)
    • Many built-in rules for the needed transformations (.c->.o, .c->.s, .o->binary, .c->.a, .c->.so, …)
    • Developers only write basic templates (configure.ac and Makefile.am)
    • Based on GNU Makefiles, but with almost no need to know how to write them
    • Lots of built-in features:
      • Launch the build: make
      • Clean the environment: make clean or make distclean
      • Install the build into the stagingdir: make DESTDIR=<PATH> install
      • Generate an archive of the current build: make dist
      • Many other useful targets...
  • Autotools process overview
    [Diagram: the developer view consists of Makefile.am, configure.ac, xxx.pc.in and config.h.in, provided by the developers and stored in the repository for each package. Running autogen.sh (aclocal, autoheader, automake, autoconf) generates aclocal.m4, configure and Makefile.in. In the user view, running configure produces config.cache, config.log, config.h, Makefile and xxx.pc.]
  • Automatic build using the autotools
    List of the most useful targets specified by the GNU Coding Standards:
    make all: build programs, libraries, documentation, etc. (same as make).
    make install: install what needs to be installed, copying the files from the package's tree to system-wide directories.
    make install-strip: same as make install, then strip debugging symbols. Some users like to trade space for useful bug reports...
    make uninstall: the opposite of make install: erase the installed files. (This needs to be run from the same build tree that was installed.)
    make clean: erase from the build tree the files built by make all.
    make distclean: additionally erase anything ./configure created.
    make check: run the test suite, if any.
    make installcheck: check the installed programs or libraries, if supported.
    make dist: recreate package-version.tar.gz from all the source files.
  • Example 1: basic project
    autogen.sh (generates the final files)
    configure.ac (template for the project)
    makefile.am (template for the Makefile)
    src/ (contains all the source code)
    include/ (contains all the headers)
    If you prefer to generate intermediate objects in an obj/ directory (or src/), you can move the makefile.am to the chosen directory. configure.in is the older name for the same file as configure.ac.
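For this layout, a minimal makefile.am might look like the following sketch; the program name myprog and the source file are placeholders, not from the slides:

```make
# Hypothetical makefile.am for the basic project above
bin_PROGRAMS    = myprog
myprog_SOURCES  = src/main.c
myprog_CPPFLAGS = -I$(top_srcdir)/include
```

automake expands these few lines into a full Makefile.in with all the standard targets (all, install, clean, dist, ...).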
  • Example 1: advanced project
    The top level keeps autogen.sh (generates the final files), configure.ac (template for the project) and a main makefile.am (main template for the Makefile); each module (module1/, module2/) has its own src/, include/ and makefile.am sub-template (sub-template 1, sub-template 2) for its Makefile.
  • Using an autotools project at a glance
    Using an autotools project is simple:
    1. Generate the final files from the autotools templates: $ ./autogen.sh (makefile.am → Makefile.in, configure.ac → configure)
    2. Launch the configure script with parameters: $ ./configure --prefix=/usr --enable-debug
    3. Launch the compilation: $ make
    4. Install the files: $ make DESTDIR=/stagingdir install
    5. Launch the unit tests: $ make check
    6. Generate the documentation: $ make html
    All the intermediate files have a .in extension; they are used as input by the configure script to generate the final files.
  • The configure.ac template
    Example of a configure.ac template:
    AC_PREREQ(2.59)
    AC_INIT([myPackage], [myPackageVersion], [ [email_address] ])
    AC_ARG_ENABLE(debug,
      AS_HELP_STRING([--enable-debug], [enable debugging support]),
      [enable_debug=$enableval], [enable_debug=no])
    if test "$enable_debug" = "yes" ; then
      CXXFLAGS="$CXXFLAGS -Wall -ggdb -O0"
      AC_DEFINE(DEBUG, 1, [Define to enable debug build])
    else
      CXXFLAGS="$CXXFLAGS -Wall -O2"
    fi
    PKG_CHECK_MODULES([DIRECTFB],[directfb],[have_libdirectfb=yes],[have_libdirectfb=no])
    if test "$have_libdirectfb" = no ; then
      AC_MSG_ERROR([Missing directfb-1.4.1 library!!])
    fi
    AC_OUTPUT(Makefile)
  • Managing version

    As a first step we can manage the version of the package:

    m4_define([gm_os_major_version], [1])
    m4_define([gm_os_minor_version], [0])
    m4_define([gm_os_micro_version], [0])
    m4_define([gm_os_version],
              [gm_os_major_version.gm_os_minor_version.gm_os_micro_version])

    AC_INIT([gm_os_posix],[gm_os_version],[ [email_address] ])

    LT_CURRENT=0
    LT_REVISION=0
    LT_AGE=0
    AC_SUBST(LT_CURRENT)
    AC_SUBST(LT_REVISION)
    AC_SUBST(LT_AGE)

    GM_OS_MAJOR_VERSION=gm_os_major_version
    GM_OS_MINOR_VERSION=gm_os_minor_version
    GM_OS_MICRO_VERSION=gm_os_micro_version
    GM_OS_VERSION=gm_os_major_version.gm_os_minor_version.gm_os_micro_version
    AC_SUBST(GM_OS_MAJOR_VERSION)
    AC_SUBST(GM_OS_MINOR_VERSION)
    AC_SUBST(GM_OS_MICRO_VERSION)
    AC_SUBST(GM_OS_VERSION)

    AM_INIT_AUTOMAKE(AC_PACKAGE_NAME, AC_PACKAGE_VERSION)
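For reference, the LT_CURRENT:LT_REVISION:LT_AGE triple above is libtool's interface version, not the package version; on Linux it determines the installed shared-object name. A sketch of the usual mapping (not part of the slides):

```
# libtool: -version-info CURRENT:REVISION:AGE
# Installed name on Linux: libNAME.so.(CURRENT-AGE).(AGE).(REVISION)
# With 0:0:0 above:        libgm_os.so.0.0.0
#                          plus symlinks libgm_os.so.0 and libgm_os.so
```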
  • Checking tools for the build

    • Then we can check that the tools needed for the compilation exist, along with their version:

      AC_C_CONST
      AC_ISC_POSIX
      AC_HEADER_STDC
      AC_PROG_CC
      AC_PROG_CC_STDC
      AC_PROG_CXX
      AC_PROG_CPP
      AC_PROG_LN_S
      AC_PROG_INSTALL
      AC_PROG_LIBTOOL
      AC_PROG_MAKE_SET

      AC_PATH_PROG([PKG_CONFIG], [pkg-config])
      if test -z "$PKG_CONFIG" ; then
          AC_MSG_ERROR([pkg-config not found])
      fi

      AC_SUBST(CXX_FOR_BUILD)
      AM_CONDITIONAL(CROSS_COMPILING, test "$cross_compiling" = "yes")
  • Customization

    The configure script has predefined rules and targets (which can be redefined) and can be customized using macros.

    Boolean switch (usage: ./configure --enable-test):

    AC_ARG_ENABLE(test,
      AS_HELP_STRING([--enable-test], [Enable the unit test support [default=no]]),
      [case "${enableval}" in
         yes) have_test=true ;;
         no)  have_test=false ;;
         *)   AC_MSG_ERROR(bad value ${enableval} for --enable-test) ;;
       esac],
      [have_test=false])

    Switch with a value (usage: ./configure --with-optim-level=2):

    AC_ARG_WITH( optim-level,
      AS_HELP_STRING([--with-optim-level=<0,1,2,3>],[Provide the optim level to give to gcc as -O<level>]),
      [current_optim_level=$withval],
      [current_optim_level=0])

    The list of available parameters can be displayed with:

    $ ./configure --help
  • Integrating the debug mode

    Usage: ./configure --enable-debug

    dnl --------------------------------------------
    dnl Brief : enable or disable the debug mode
    dnl Mandatory : no
    dnl Values : none (just enable the debug mode if set)
    dnl Default's value : release mode
    dnl --------------------------------------------
    AC_ARG_ENABLE(debug,
      AS_HELP_STRING([--enable-debug], [Enable the debug support [default=no]]),
      [case "${enableval}" in
         yes) have_debug=true ;;
         no)  have_debug=false ;;
         *)   AC_MSG_ERROR(bad value ${enableval} for --enable-debug) ;;
       esac],
      [have_debug=false])

    dnl --------------------------------------------
    dnl Export the conditional HAVE_DEBUG variable to the Makefiles
    dnl --------------------------------------------
    AM_CONDITIONAL(HAVE_DEBUG, $have_debug)

    dnl --------------------------------------------
    dnl Test the have_debug variable
    dnl --------------------------------------------
    AC_MSG_CHECKING([Checking the debug support])
    if test "$have_debug" = "true" ; then
      AC_MSG_RESULT([yes])

      dnl Export the DEBUG define to the Makefiles
      AC_DEFINE(DEBUG, 1, [Define to enable debug mode])

      dnl Update the CFLAGS and CPPFLAGS with debug options
      dnl -ggdb : debug symbols in the native format for GDB
      DEBUG_CFLAGS=" -ggdb"
      DEBUG_CPPFLAGS=" -ggdb"
      DEBUG_LDFLAGS=""

      dnl Disable the silent mode
      m4_ifdef([AM_SILENT_RULES],[AM_SILENT_RULES([no])])
    else
      AC_MSG_RESULT([no])

      dnl Enable the silent mode
      m4_ifdef([AM_SILENT_RULES],[AM_SILENT_RULES([yes])])
    fi

    • If set, it sets all the flags needed in debug mode; it also disables the silent mode.
  • Integrating some optimization

    It is possible to tell gcc which optimization level to use for a build.

    Usage: $ ./configure --with-optim-level=3

    AC_MSG_CHECKING([Optimization level])
    AC_ARG_WITH( optim-level,
      AS_HELP_STRING([--with-optim-level=<0,1,2,3>],[Provide the optim level to give to gcc as -O<level>]),
      [current_optim_level=$withval],
      [current_optim_level=0])

    if test "$current_optim_level" != "0" ; then
      dnl override the default optim level
      case "$current_optim_level" in
        "1" | "2" | "3" ) ;;
        *) AC_MSG_ERROR(bad value ${withval} for the optim-level parameter. It must be a number between 1 and 3.) ;;
      esac
    fi

    if test "$have_debug" = "true" ; then
      current_optim_level="0"
      AC_MSG_RESULT([-O$current_optim_level !! Remove all optimization in debug mode.])
    else
      AC_MSG_RESULT([-O$current_optim_level])
    fi

    By default, no optimization is set; in debug mode none is required.
  • Integrating gnu/gprof

    Usage: ./configure --enable-profiling

    AC_ARG_ENABLE(profiling,
      AS_HELP_STRING([--enable-profiling], [Enable the builtin profiling (gnu/gprof) support [default=no]]),
      [case "${enableval}" in
         yes) have_profiling=true ;;
         no)  have_profiling=false ;;
         *)   AC_MSG_ERROR(bad value ${enableval} for --enable-profiling) ;;
       esac],
      [have_profiling=false])

    dnl --------------------------------------------
    dnl Test the have_profiling variable
    dnl --------------------------------------------
    AC_MSG_CHECKING([Checking the profiling support])
    if test "$have_profiling" = "true" ; then
      AC_MSG_RESULT([yes])
      dnl Update the CFLAGS, CPPFLAGS and LDFLAGS with gprof options for gcc
      PROFILING_CFLAGS=" -pg"
      PROFILING_CPPFLAGS=" -pg"
      PROFILING_LDFLAGS=" -pg"
    else
      AC_MSG_RESULT([no])
    fi

    • If set, it adds -pg both to the CFLAGS and to the LDFLAGS.
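Once the program is built with -pg, profiling data is collected at run time and analyzed with gprof. A usage sketch (the program name is illustrative):

```sh
./myprog                              # run: writes gmon.out in the current directory
gprof ./myprog gmon.out > profile.txt # flat profile + call graph
```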
  • Integrating gnu/gcov

    Usage: ./configure --enable-covering

    dnl --------------------------------------------
    dnl Brief : enable or disable the builtin coverage mode
    dnl Mandatory : no
    dnl Values : none (just enable the coverage mode if set)
    dnl --------------------------------------------
    AC_ARG_ENABLE(covering,
      AS_HELP_STRING([--enable-covering], [Enable the builtin coverage (gnu/gcov) support [default=no]]),
      [case "${enableval}" in
         yes) have_covering=true ;;
         no)  have_covering=false ;;
         *)   AC_MSG_ERROR(bad value ${enableval} for --enable-covering) ;;
       esac],
      [have_covering=false])

    dnl --------------------------------------------
    dnl Test the have_covering variable
    dnl --------------------------------------------
    AC_MSG_CHECKING([Checking the covering support])
    if test "$have_covering" = "true" ; then
      AC_MSG_RESULT([yes])
      dnl Update the CFLAGS and CPPFLAGS with gcov options for gcc
      COVERING_CFLAGS=" -fprofile-arcs -ftest-coverage"
      COVERING_CPPFLAGS=" -fprofile-arcs -ftest-coverage"
      COVERING_LDFLAGS=""
    else
      AC_MSG_RESULT([no])
    fi

    • If set, it adds -fprofile-arcs -ftest-coverage to the CFLAGS.
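After building and running the instrumented binary, per-line coverage is extracted with gcov. A usage sketch (the test binary and source file names are illustrative):

```sh
./gmos_gtest            # run the instrumented program: writes .gcda data files
gcov src/gm_os_misc.c   # produces gm_os_misc.c.gcov with per-line execution counts
```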
  • Integrating unit tests

    Usage: ./configure --enable-test, then make check

    configure.ac:

    AC_ARG_ENABLE(test,
      AS_HELP_STRING([--enable-test], [Enable the unit test support [default=no]]),
      [case "${enableval}" in
         yes) have_test=true ;;
         no)  have_test=false ;;
         *)   AC_MSG_ERROR(bad value ${enableval} for --enable-test) ;;
       esac],
      [have_test=false])

    AM_CONDITIONAL(HAVE_TEST, $have_test)

    AC_MSG_CHECKING(Checking the test support)
    if test "$have_test" != "false" ; then
      AC_MSG_RESULT([yes])
      PKG_CHECK_MODULES([GTEST],[gtest],[have_libgtest=yes],[have_libgtest=no])
      if test "$have_libgtest" = no ; then
        AC_MSG_ERROR([Missing libgtest library (http://code.google.com/p/googletest/) !!])
      fi
    else
      AC_MSG_RESULT([no])
    fi

    Makefile.am:

    if HAVE_TEST
    TESTS = gmos_gtest
    else
    TESTS =
    endif
    noinst_PROGRAMS = $(TESTS)
    gmos_gtest_SOURCES  = $(top_srcdir)/unit_test/src/test_gm_os.cpp
    gmos_gtest_CPPFLAGS = -I$(HEADER_DIR) $(GTEST_CFLAGS)
    gmos_gtest_LDFLAGS  = $(top_srcdir)/.libs/libgm_os.a -lpthread
    gmos_gtest_LDADD    = $(GTEST_LIBS)
  • Checking dependencies

    • For a fully automatic build process, it is useful to get metadata about a library (version, name, cflags, ldflags, …). This is what pkg-config brings to the oss community.
    • Example: resolving a dependency:

      # libglib2.0 dependency
      PKG_CHECK_MODULES([GLIB2],[glib-2.0],[have_libglib=yes],[have_libglib=no])
      dnl making this test is mandatory
      if test "$have_libglib" = no ; then
          AC_MSG_ERROR([Missing libglib-2.0 library])
      fi

    • This checks for the presence of the .pc file that contains the metadata (usually in /usr/local/lib/pkgconfig): $ pkg-config --exists libname
    • The PKG_CONFIG_PATH variable should be set to the directory that contains the .pc files.
    • The M4 macro then exports both the CFLAGS and the LIBS:
    • $ pkg-config --cflags libname
    • $ pkg-config --libs libname
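The .pc file queried above is a small text file shipped and installed by the library itself. A minimal, illustrative gm_os.pc (all names and paths here are assumptions for the example library used in these slides):

```
prefix=/usr
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir=${prefix}/include

Name: gm_os
Description: OS abstraction library (example)
Version: 1.0.0
Requires: glib-2.0
Cflags: -I${includedir}/gm_os
Libs: -L${libdir} -lgm_os
```

pkg-config --cflags and --libs simply print the Cflags and Libs fields, recursively merged with those of the packages listed in Requires.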
  • Verbose/silent mode

    Building with all details can be useful in debug mode, but it may be unwanted in release mode. A macro selects whether the silent mode is enabled or not:

    m4_ifdef([AM_SILENT_RULES],[AM_SILENT_RULES([yes/no])])
  • Example of Makefile template (Makefile.am)

    HEADER_DIR = $(top_srcdir)/inc

    # ----------------------------------------------------------------
    # Headers to be installed in the stagingdir (make install)
    # gm_os is the prefix in <stagingdir_path>/usr/include/gm_os/<headers>
    # ----------------------------------------------------------------
    lib_includedir = $(includedir)/gm_os/
    lib_include_HEADERS = $(HEADER_DIR)/gm_os_types.h $(HEADER_DIR)/gm_os_trace.h $(HEADER_DIR)/gm_os.h

    # ----------------------------------------------------------------
    # Used for the installation in the stagingdir
    # ----------------------------------------------------------------
    pkgconfigdir = $(libdir)/pkgconfig
    pkgconfig_DATA = gm_os.pc

    # ----------------------------------------------------------------
    # Compilation and generation of the gm_os library
    # ----------------------------------------------------------------
    lib_LTLIBRARIES = libgm_os.la
    libgm_os_la_SOURCES = src/gm_os_misc.c src/gm_os_heap.c src/gm_os_message.c
    libgm_os_la_CFLAGS  = -I$(HEADER_DIR) $(GLIB2_CFLAGS) $(MY_DEBUG_CFLAG)
    libgm_os_la_LDFLAGS = -version-info $(LT_CURRENT):$(LT_REVISION):$(LT_AGE) -lpthread $(MY_DEBUG_LDFLAGS)
    libgm_os_la_LIBADD  = $(GLIB2_LIBS)
  • Doxygen documentation

    Usage: make html

    The documentation is automatically generated in doxygen (html) format. A generic doxygen configuration (default-Doxyfile) is used; it is extended by the makefile before launching the documentation build:

    Makefile.am:

    DOXYFILE ?= $(srcdir)/default-Doxyfile

    html-local:
    	@echo -n "Removing the previous documentation : "
    	@rm -fR ./html
    	@echo "OK"
    	@echo -n "Preparing the documentation requirement : "
    	@cp $(DOXYFILE) Doxyfile
    	@chmod +w Doxyfile
    	@echo "PROJECT_NAME=@PACKAGE_NAME@" >> Doxyfile
    	@echo "PROJECT_NUMBER=@PACKAGE_VERSION@" >> Doxyfile
    	@echo "INPUT=$(srcdir)/inc" >> Doxyfile
    	@echo "OK"
    	@echo -n "Generating the documentation in doxygen format : "
    	@doxygen > /dev/null 2>&1
    	@rm -f Doxyfile
    	@echo "OK"

    This creates a new html directory containing the documentation; the main html file is index.html.
  • gnu/make/autotools vs tmake

    tmake:
    • Need to provide the tmake engine package
    • No documentation
    • Few people know how it works
    • Not actively supported (no longer maintained)

    The autotools:
    • Use GNU software such as the binutils
    • Need to know the templates' syntax, but:
      • Well documented (internet, books, …)
      • Well supported by the community
      • Lots of people know them
      • Becoming a standard in the global oss community
    • Templates make software standalone and portable
  • Conclusion as a summary

    • What should the developer do?
      • Write the configure.ac script needed to generate the final GNU Makefiles
      • Write basic Makefile.am files using the autotools

    • What should the user do?
      • Configure: ./configure --prefix=<my_path> ...
      • Compile: make
      • Install: make DESTDIR=<stagingdir> install
      • Deliver: make dist (creates a source archive)
      • Clean: make clean or make distclean
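The user-side sequence above can be sketched as a shell session. An out-of-tree build is shown here as an option; the package name and paths are illustrative:

```sh
# Typical user build from a distributed source archive
tar xf mypackage-1.0.0.tar.gz && cd mypackage-1.0.0
mkdir build && cd build                # optional: build outside the source tree
../configure --prefix=/usr             # configure for the host
make                                   # compile
make DESTDIR=/tmp/stagingdir install   # install into a staging tree
make dist                              # recreate the source archive
make distclean                         # remove everything configure generated
```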
  • Any question?