08-20-2023, 01:15 PM
(This post was last modified: 08-20-2023, 01:19 PM by mnrvovrfc. Edit Reason: Sorry TL;DR)
It's hard to call compiler creators lazy, because developing a compiler is difficult to begin with. Granted, it's way easier than developing a complete operating system with drivers and a visual environment, but it's still a serious undertaking.
I'll tell you one thing: as far as "gcc" is concerned, the program "ld" is a sorry excuse. Generally it just appends one object file to another and then does whatever is necessary to make sure that mess can be executed, as a program or as a library. It has only basic searching capabilities. It has to unpack a "dot-a" archive itself, or, I believe, it summons "ar" to do it.
EDIT: Otherwise "ld" requires precise instructions. Well, so does the compiler, but there are many more ways to get "ld" to refuse to create an executable file.
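To illustrate (just a sketch; the file names, the symbol name and the library name are made up), this is roughly the pipeline that the "gcc" driver hides from you: compile to object files, optionally bundle them into a "dot-a" archive with "ar", then let "ld" stitch everything together:

/* util.c -- hypothetical helper, compiled and archived with:
 *   gcc -c util.c
 *   ar rcs libutil.a util.o
 */
int twice(int n) { return n * 2; }

/* main.c -- linked against the archive; the gcc driver calls ld
 * behind the scenes and pulls twice() out of libutil.a:
 *   gcc -c main.c
 *   gcc main.o -L. -lutil -o demo
 */
int twice(int n);                /* prototype for the archived function */
#include <stdio.h>
int main(void)
{
    printf("%d\n", twice(21));   /* prints 42 */
    return 0;
}

Run "gcc" with "-v" on that last link step and you can watch it hand the whole pile of objects and search paths off to the linker.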
Previously I underestimated "LINK.EXE" and "QLINK.EXE" from the M$ language products, because I had noticed what was going on with the Power C linker: all it did was search two "library" files after appending the object files the programmer had asked the compiler to produce.
Some compilers don't create object files directly; that job is left to the assembler, so in some cases the assembler is the one that deserves to be called lazy. The move from 16-bit to 32-bit to 64-bit mostly just gave the registers slightly different names according to their size. To make the transition from small to large bit depth easy, almost nothing was changed about code generation, except for having to deal with those ugly pseudo-assembler blocks in C and C++ programs.
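For example (a minimal sketch, assuming x86-64 and GCC/Clang extended-asm syntax; the function name is made up), this is the kind of inline-assembly block I mean, and the only real difference from the 32-bit version would be the register names and the instruction suffix (eax/ebx and addl instead of rax/rbx and addq):

#include <stdint.h>
#include <stdio.h>

static uint64_t add_asm(uint64_t a, uint64_t b)
{
    uint64_t result;
    __asm__ ("addq %%rbx, %%rax"      /* rax = rax + rbx */
             : "=a" (result)          /* output: result taken from rax */
             : "a" (a), "b" (b));     /* inputs: a in rax, b in rbx */
    return result;
}

int main(void)
{
    printf("%llu\n", (unsigned long long)add_asm(40, 2));   /* prints 42 */
    return 0;
}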
EDIT: Take into account the many different CPUs that the assemblers have to generate code for. Take a quick look at the "man pages" for "gcc", at the compiler switches that control CPU-specific code generation. There are something like 50 different CPUs in there!
I'm just rambling, but creating a compiler and/or assembler isn't a small undertaking for a group of people who expect to sit back later and reap the rewards of their work. There was a lot of grousing from the AT&T employees who were trying to get the C programming language right. That was before many things had to be added to the language, especially function prototyping, which made code generation more complicated but was necessary to reduce the difficult bugs that resulted from calling functions improperly.

Before "gcc" became "mature" there was almost no type-casting. Now you see it everywhere in C and C++ code (especially in what QB64 has to generate for "g++" to compile), because it was another thing that must have been driving "lazy" programmers nuts, and rather than blame themselves they blamed the compiler and assembler creators. There is still a lot of C "legacy" code that has never been fixed, and the companies that own those vaults refuse to fix it; they complain it costs too much money.
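As a small illustration of why prototypes mattered so much (function name made up, just a sketch): with a prototype in scope the compiler converts the argument and rejects calls with the wrong number or type of arguments, while old-style C would happily pass the raw bits and leave you with one of those difficult runtime bugs.

#include <stdio.h>

/* prototype: the compiler now knows half() takes a double */
double half(double x);

int main(void)
{
    /* 5 is an int literal; the prototype makes the compiler convert it
       to 5.0 before the call.  Without a prototype (pre-ANSI C), the
       int's bits would be passed as-is and half() would read garbage. */
    printf("%f\n", half(5));     /* prints 2.500000 */
    return 0;
}

double half(double x)
{
    return x / 2.0;
}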