Cross compiler

A cross compiler is a compiler capable of creating executable code for a platform other than the one on which the compiler is running. For example, a compiler that runs on a PC but generates code that runs on Android devices is a cross compiler.

A cross compiler is useful to compile code for multiple platforms from one development host. Direct compilation on the target platform might be infeasible, for example on embedded systems with limited computing resources.

Cross compilers are distinct from source-to-source compilers. A cross compiler generates machine code for a platform other than the one it runs on, while a source-to-source compiler translates source code from one programming language to another. Both are programming tools.

Use

The fundamental use of a cross compiler is to separate the build environment from the target environment. This is useful in several situations, for example when the target cannot practically host its own builds (as with resource-constrained embedded systems) or when a single development host must produce binaries for many different targets.

Use of virtual machines (such as Java's JVM) resolves some of the reasons for which cross compilers were developed. The virtual machine paradigm allows the same compiler output to be used across multiple target systems, although this is not always ideal because virtual machines are often slower and the compiled program can only be run on computers with that virtual machine.

Typically the hardware architecture differs (e.g. coding a program destined for the MIPS architecture on an x86 computer) but cross-compilation is also usable when only the operating system environment differs, as when compiling a FreeBSD program under Linux, or even just the system library, as when compiling programs with uClibc on a glibc host.

Canadian Cross

The Canadian Cross is a technique for building cross compilers for other machines, where the original machine is much slower or less convenient than the target. Given three machines A, B, and C, one uses machine A (e.g. running Windows XP on an IA-32 processor) to build a cross compiler that runs on machine B (e.g. running macOS on an x86-64 processor) to create executables for machine C (e.g. running Android on an ARM processor). In this example, machine A is slow but has a proprietary compiler, machine B is fast but has no compiler at all, and machine C is impractically slow to compile on.

When using the Canadian Cross with GCC, as in this example, four compilers may be involved: (1) the proprietary native compiler for machine A; (2) the GCC native compiler for machine A, built with compiler 1; (3) the GCC cross compiler that runs on machine A and produces executables for machine B, built with compiler 2; and (4) the GCC cross compiler that runs on machine B and produces executables for machine C, built with compiler 3.

(Figure: scheme of an example Canadian Cross.)

The end-result cross compiler (4) will not be able to run on build machine A; instead, it runs on machine B to compile an application into executable code that is then copied to machine C and executed there.

For instance, NetBSD provides a POSIX Unix shell script named build.sh which will first build its own toolchain with the host's compiler; this, in turn, will be used to build the cross compiler which will be used to build the whole system.
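
A minimal sketch of such an invocation might look like the following; the machine type evbarm is only an illustrative example, and the tools step is normally also run implicitly by release:

# build the cross toolchain with the host compiler, then a full NetBSD release,
# for an assumed target machine type of "evbarm"
./build.sh -m evbarm tools
./build.sh -m evbarm release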

The term Canadian Cross came about because at the time that these issues were under discussion, Canada had three national political parties. [1]

GCC and cross compilation

GCC, a free software collection of compilers, can be set up to cross compile. It supports many platforms and languages.

GCC requires that a compiled copy of binutils is available for each targeted platform. Especially important is the GNU Assembler. Therefore, binutils first has to be compiled correctly with the switch --target=some-target sent to the configure script. GCC also has to be configured with the same --target option. GCC can then be run normally provided that the tools, which binutils creates, are available in the path, which can be done using the following (on UNIX-like operating systems with bash):

PATH=/path/to/binutils/bin:${PATH} make
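
For context, a minimal sketch of the steps leading up to this might look like the following; the target triplet arm-none-eabi, the /opt/cross prefix, and the directory names are assumptions chosen only for illustration, not a definitive recipe:

# assumed: sources unpacked as binutils-x.y/ and gcc-x.y/, separate build directories,
# an illustrative bare-metal ARM target, and an install prefix of /opt/cross
TARGET=arm-none-eabi
PREFIX=/opt/cross
export PATH=$PREFIX/bin:$PATH

# 1. configure, build and install binutils (assembler, linker, etc.) for the target
mkdir build-binutils && cd build-binutils
../binutils-x.y/configure --target=$TARGET --prefix=$PREFIX
make && make install
cd ..

# 2. configure GCC with the same --target and build the compiler proper,
#    relying on the binutils tools just placed on the PATH
mkdir build-gcc && cd build-gcc
../gcc-x.y/configure --target=$TARGET --prefix=$PREFIX --enable-languages=c --without-headers
make all-gcc && make install-gcc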

Cross-compiling GCC requires that a portion of the target platform's C standard library be available on the host platform. The programmer may choose to compile the full C library, but this choice could be unreliable. The alternative is to use newlib, which is a small C library containing only the most essential components required to compile C source code.
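
If newlib is chosen, GCC is usually told so when it is configured; --with-newlib is a standard GCC configure switch, while the rest of this command line simply continues the illustrative sketch above (building newlib itself and the target libraries is a further step not shown here):

# assumed continuation of the sketch above: configure GCC to use newlib as the target C library
../gcc-x.y/configure --target=$TARGET --prefix=$PREFIX --with-newlib --enable-languages=c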

The GNU Autotools packages (i.e. autoconf, automake, and libtool) use the notion of a build platform, a host platform, and a target platform. The build platform is where the compiler is actually compiled. In most cases, build should be left undefined (it will default from host). The host platform is always where the output artifacts from the compiler will be executed, whether the output is another compiler or not. The target platform is used when cross-compiling cross compilers; it represents what type of object code the package will produce; otherwise the target platform setting is irrelevant. [2] For example, consider cross-compiling a video game that will run on a Dreamcast. The machine where the game is compiled is the build platform, while the Dreamcast is the host platform. The names host and target are relative to the compiler being used and shifted like son and grandson. [3]
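
With Autotools-based packages these roles are expressed as configure options; the triplets below are illustrative examples rather than values taken from the Dreamcast case:

# build an ordinary package on an x86-64 Linux machine so that it runs on an ARM Linux system
./configure --build=x86_64-pc-linux-gnu --host=arm-linux-gnueabihf

# --target only matters when the package being built is itself a compiler or binutils,
# i.e. when building a cross compiler that will emit code for a third platform
./configure --build=x86_64-pc-linux-gnu --host=arm-linux-gnueabihf --target=mips-linux-gnu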

Another method popularly used by embedded Linux developers involves the combination of GCC compilers with specialized sandboxes like Scratchbox and Scratchbox 2, or PRoot. These tools create a "chrooted" sandbox where the programmer can build up necessary tools, libc, and libraries without having to set extra paths. Facilities are also provided to "deceive" the runtime so that it "believes" it is actually running on the intended target CPU (such as an ARM architecture); this allows configuration scripts and the like to run without error. Scratchbox runs more slowly by comparison to "non-chrooted" methods, and most tools that are on the host must be moved into Scratchbox to function.
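
As a rough sketch of the idea (the root filesystem path and the QEMU binary name are assumptions), PRoot can combine a target root filesystem with a CPU emulator so that target binaries, and thus configure scripts, run on the host:

# illustrative only: run a build inside an ARM root filesystem, emulating its binaries with QEMU
proot -R ./arm-rootfs -q qemu-arm-static /bin/sh -c "./configure && make"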

Manx Aztec C cross compilers

Manx Software Systems, of Shrewsbury, New Jersey, produced C compilers beginning in the 1980s targeted at professional developers for a variety of platforms up to and including IBM PC compatibles and Macs.

Manx's Aztec C programming language was available for a variety of platforms, including MS-DOS, the Apple II (DOS 3.3 and ProDOS), the Commodore 64, the 68k-based Macintosh [4] and the Amiga.

From the 1980s and continuing throughout the 1990s until Manx Software Systems disappeared, the MS-DOS version of Aztec C [5] was offered both as a native mode compiler and as a cross compiler for other platforms with different processors, including the Commodore 64 [6] and the Apple II. [7] Internet distributions of Aztec C, including its MS-DOS based cross compilers, still exist and remain in use today.

Manx's Aztec C86, their native mode 8086 MS-DOS compiler, was also a cross compiler. Although it did not compile code for a different processor like their Aztec C65 6502 cross compilers for the Commodore 64 and Apple II, it created binary executables for then-legacy operating systems for the 16-bit 8086 family of processors.

When the IBM PC was first introduced it was available with a choice of operating systems, CP/M-86 and PC DOS being two of them. Aztec C86 was provided with link libraries for generating code for both IBM PC operating systems. Throughout the 1980s, later versions of Aztec C86 (3.xx, 4.xx and 5.xx) added support for MS-DOS "transitory" versions 1 and 2, [8] which were less robust than the "baseline" MS-DOS version 3 and later that Aztec C86 targeted until its demise.

Finally, Aztec C86 provided C language developers with the ability to produce ROM-able "HEX" code which could then be transferred using a ROM burner directly to an 8086 based processor. Paravirtualization may be more common today, but the practice of creating low-level ROM code was proportionally more common in those years, when device driver development was often done by application programmers for individual applications and new devices amounted to a cottage industry. It was not uncommon for application programmers to interface directly with hardware without support from the manufacturer. This practice was similar to embedded systems development today.

Thomas Fenwick and James Goodnow II were the two principal developers of Aztec-C. Fenwick later became notable as the author of the Microsoft Windows CE kernel or NK ("New Kernel") as it was then called. [9]

Microsoft C cross compilers

Early history – 1980s

Microsoft C (MSC) has a shorter history than others, [10] dating back to the 1980s. The first Microsoft C compilers were made by the same company that made Lattice C and were rebranded by Microsoft as their own until MSC 4 was released, which was the first version that Microsoft produced itself. [11]

In 1987, many developers started switching to Microsoft C, and many more would follow throughout the development of Microsoft Windows to its present state. Products like Clipper and later Clarion emerged that offered easy database application development by using cross language techniques, allowing part of their programs to be compiled with Microsoft C.

Borland C (from Borland, a California company) was available for purchase years before Microsoft released its first C product.

Long before Borland, BSD Unix (from the University of California, Berkeley) had received C from the authors of the language, Kernighan and Ritchie, who developed it while working at AT&T Bell Labs. Their original goal was not only a more elegant second-level syntax to replace first-level assembly syntax: C was designed so that only a minimal amount of assembly needed to be written to support each platform; in other words, the language was conceived from the start to allow cross compilation with the least support code per platform, which was exactly what its authors needed. Early C also mapped directly onto the underlying assembly code wherever the program was not platform dependent. Today's C (and even more so C++) is no longer that close to the machine, and the generated assembly can differ greatly from what would be written by hand on a given platform (on Linux, for example, library calls are sometimes replaced or detoured according to distributor choices). Today's C is effectively a third- or fourth-level language that is still used in the old way, like a second-level language.

1987

C programs had long been linked with modules written in assembly language. Most C compilers (even current compilers) offer an assembly language pass that can be hand-tuned for efficiency and then linked to the rest of the program after assembling.

Compilers like Aztec-C converted everything to assembly language in one distinct pass and then assembled the code in another, and were noted for their very efficient and small code. By 1987, however, the optimizer built into Microsoft C was very good, and only "mission critical" parts of a program were usually considered for rewriting. In fact, C had taken over as the "lowest-level" language: programming was becoming a multi-disciplinary growth industry, projects were becoming larger, programmers were writing user interfaces and database interfaces in higher-level languages, and a need had emerged for cross language development that continues to this day.

By 1987, with the release of MSC 5.1, Microsoft offered a cross language development environment for MS-DOS. 16-bit binary object code written in assembly language (MASM) and Microsoft's other languages, including QuickBASIC, Pascal, and Fortran, could be linked together into one program, in a process Microsoft called "Mixed Language Programming", now known as "InterLanguage Calling". [12] If BASIC was used in this mix, the main program needed to be in BASIC to support the internal runtime system that compiled BASIC required for garbage collection and its other managed operations that simulated a BASIC interpreter like QBasic in MS-DOS.

The calling convention for C code, in particular, was to pass parameters in "reverse order" on the stack and to return values on the stack rather than in a processor register. There were other programming rules to make all the languages work together, but this particular rule persisted through the cross language development that continued throughout the Windows 16- and 32-bit versions and in the development of programs for OS/2, and it persists to this day. It is known as the Pascal calling convention.

Another type of cross compilation that Microsoft C was used for during this time was in retail applications that require handheld devices like the Symbol Technologies PDT3100 (used to take inventory), which provided a link library targeted at an 8088 based barcode reader. The application was built on the host computer then transferred to the handheld device (via a serial cable) where it was run, similar to what is done today for that same market using Windows Mobile by companies like Motorola, who bought Symbol.

Early 1990s

Throughout the 1990s and beginning with MSC 6 (their first ANSI C compliant compiler) Microsoft re-focused their C compilers on the emerging Windows market, and also on OS/2 and in the development of GUI programs. Mixed language compatibility remained through MSC 6 on the MS-DOS side, but the API for Microsoft Windows 3.0 and 3.1 was written in MSC 6. MSC 6 was also extended to provide support for 32-bit assemblies and support for the emerging Windows for Workgroups and Windows NT which would form the foundation for Windows XP. A programming practice called a thunk was introduced to allow passing between 16- and 32-bit programs that took advantage of runtime binding (dynamic linking) rather than the static binding that was favoured in monolithic 16-bit MS-DOS applications. Static binding is still favoured by some native code developers but does not generally provide the degree of code reuse required by newer best practices like the Capability Maturity Model (CMM).

MS-DOS support was still provided with the release of Microsoft's first C++ Compiler, MSC 7, which was backwardly compatible with the C programming language and MS-DOS and supported both 16- and 32-bit code generation.

MSC took over where Aztec C86 left off. The market share for C compilers had turned to cross compilers which took advantage of the latest and greatest Windows features, offered C and C++ in a single bundle, and still supported MS-DOS systems that were already a decade old. The smaller companies that produced compilers like Aztec C could no longer compete and either turned to niche markets such as embedded systems or disappeared.

MS-DOS and 16-bit code generation support continued until MSC 8.00c, which was bundled with Microsoft C++ and Microsoft Application Studio 1.5, the forerunner of Microsoft Visual Studio, the cross development environment that Microsoft provides today.

Late 1990s

MSC 12, released with Microsoft Visual Studio 6, no longer supported MS-DOS 16-bit binaries, instead targeting 32-bit console applications, and it supported code generation for Windows 95 and Windows 98 as well as for Windows NT. Link libraries were available for other processors that ran Microsoft Windows, a practice that Microsoft continues to this day.

MSC 13 was released with Visual Studio 2003, and MSC 14 was released with Visual Studio 2005; both still produce code for older systems like Windows 95 and also produce code for several target platforms, including the mobile market and the ARM architecture.

.NET and beyond

In 2001 Microsoft developed the Common Language Runtime (CLR), which formed the core of the .NET Framework compilers in the Visual Studio IDE. This layer on top of the operating system, exposed through the API, allows the mixing of development languages compiled across platforms that run the Windows operating system.

The .NET Framework runtime and CLR provide a mapping layer to the core routines for the processor and the devices on the target computer. The command-line C compiler in Visual Studio will compile native code for a variety of processors and can be used to build the core routines themselves.

Microsoft .NET applications for target platforms like Windows Mobile on the ARM architecture cross-compile on Windows machines with a variety of processors, and Microsoft also offers emulators and remote deployment environments that require very little configuration, unlike the cross compilers of days gone by or those on other platforms.

Runtime libraries, such as Mono, provide compatibility for cross-compiled .NET programs to other operating systems, such as Linux.
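
For example, a .NET executable built on Windows (the program name here is hypothetical) can often be run unchanged on Linux under the Mono runtime:

# run a Windows-built .NET executable on Linux using Mono
mono MyApp.exe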

Libraries like Qt and its predecessors, including XVT, provide source code level cross development capability with other platforms, while still using Microsoft C to build the Windows versions. Other compilers like MinGW have also become popular in this area, since they are more directly compatible with the Unixes that comprise the non-Windows side of software development, allowing those developers to target all platforms using a familiar build environment.

Free Pascal

Free Pascal was developed from the beginning as a cross compiler. The compiler executable (ppcXXX, where XXX is a target architecture) is capable of producing executables (or just object files if no internal linker exists, or even just assembly files if no internal assembler exists) for all operating systems of the same architecture. For example, ppc386 is capable of producing executables for i386-linux, i386-win32, i386-go32v2 (DOS) and all other i386 OSes (see [13]). To compile for another architecture, however, a cross-architecture version of the compiler must be built first. The resulting compiler executable has an additional "ross" before the target architecture in its name (i.e. it is named ppcrossXXX); for example, if the compiler is built to target x64, the executable is ppcrossx64.

To compile for a chosen architecture and OS, the compiler switches -P and -T (of the compiler driver fpc) can be used. This is also done when cross-compiling the compiler itself, but is then set via the make options CPU_TARGET and OS_TARGET. A GNU assembler and linker for the target platform are required if Free Pascal does not yet have an internal version of those tools for the target platform.
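
As a brief hedged illustration (the unit name and the chosen target are arbitrary examples):

# compile a program for ARM Linux using the fpc compiler driver
fpc -Parm -Tlinux hello.pas

# cross-build the compiler and run-time library themselves for the same target
make all CPU_TARGET=arm OS_TARGET=linux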

Clang

Clang is natively a cross compiler: at build time, you can select which architectures you want Clang to be able to target.
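
For instance (the target triplet and sysroot path are illustrative), a single Clang binary can emit code for another architecture when pointed at a suitable target and sysroot; a matching linker and target libraries are still required:

# compile and link a C program for an assumed AArch64 Linux target
clang --target=aarch64-linux-gnu --sysroot=/path/to/aarch64-sysroot -o hello hello.c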

References

  1. "4.9 Canadian Crosses". CrossGCC. Archived from the original on October 9, 2004. Retrieved 2012-08-08. This is called a `Canadian Cross' because at the time a name was needed, Canada had three national parties.
  2. "Cross-Compilation (Automake)".
  3. "Cross compilation".
  4. "Obsolete Macintosh Computers". Archived from the original on 2008-02-26. Retrieved 2008-03-10.
  5. Aztec C
  6. Commodore 64
  7. Apple II
  8. MS-DOS Timeline Archived 2008-05-01 at the Wayback Machine
  9. Inside Windows CE (search for Fenwick)
  10. Microsoft Language Utility Version History
  11. History of PC based C-compilers Archived December 15, 2007, at the Wayback Machine
  12. Which Basic Versions Can CALL C, FORTRAN, Pascal, MASM
  13. "Free Pascal Supported Platform List". Platform List. Retrieved 2010-06-17. i386