Idea to bring back the old tpumover.exe

By Andreas Hausladen | October 11, 2010

C++ has static libraries, and Delphi doesn’t have an exact equivalent. The nearest thing in Delphi is a *.dcp file, but there is a major difference between the two: a *.dcp file only contains bpl import stubs with metadata, whereas a static library consists of one or more object files that contain the full code or dll import stubs. So a *.dcp file doesn’t allow you to put units into it without having a *.bpl file.

Why am I writing about this? Because it has something to do with compile and Code Insight speed. The Delphi compiler has to search the disk for all the *.dcu and *.pas files in all the search-path directories. And it does that for every compile, which can take a lot of time if you have a slow hard disk, especially in the morning when you’ve just started your computer and the file cache is empty or filled with the wrong information after an svn update. So I came up with the idea of writing all the output *.dcu files to a single *.dcl file and reading from it when doing the next “make”. This way there is only one file I can map or load into memory, alter and write back to disk. I have thought about this for more than two years but never found the time to actually write a proof of concept. But today I remembered a tool that came with TurboPascal/BorlandPascal and was used to add *.tpu files (the predecessor of *.dcu) to a *.tpl file that was automatically loaded by the compiler/IDE. Its name was tpumover.exe. All units in the turbo.tpl file were available to your project without specifying a search path.

After remembering how this tool worked, I extended my idea: not only create one *.dcl file per project, but also allow replacing the search path with a list of *.dcl files. This way it would be possible to have a bdslib.dcl (all units from $(BDS)\lib), a jedi.dcl and so on… And the compiler (or, to be more precise, my compiler hook) would map those files into memory instead of loading the units one by one from disk. An index in those *.dcl files could make the compiler’s file search really fast, especially because every *.pas search in the virtual *.dcl search paths would fail without any I/O.

dcumover.exe *.dcu bdslib.dcl
SearchPath=^$(BDSDCL)\lib\debug\bdslibdebug.dcl;^$(BDSDCL)\lib\bdslib.dcl;^$(BDSDCL)\jvcl.dcl
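The lookup such a compiler hook would perform can be sketched in Python (purely illustrative; names like `DclArchive` and `find_unit` are hypothetical, and the real hook would be native code inside the compiler process). Once a *.dcl is loaded, every probe is a hash lookup, so a miss (e.g. every *.pas probe against a virtual path) costs no I/O:

```python
import os

class DclArchive:
    """A *.dcl file held in memory: unit file name -> compiled unit bytes."""

    def __init__(self, entries):
        # entries: {"sysutils.dcu": b"...", "classes.dcu": b"..."}
        self._index = {name.lower(): data for name, data in entries.items()}

    def lookup(self, filename):
        # One hash lookup; a miss (e.g. "myunit.pas") causes no disk I/O.
        return self._index.get(filename.lower())

def find_unit(filename, search_path, archives):
    """Probe the virtual .dcl 'paths' first, then real directories."""
    for archive in archives:            # entries prefixed with ^ in the path
        data = archive.lookup(filename)
        if data is not None:
            return data
    for directory in search_path:       # ordinary directories: real disk I/O
        candidate = os.path.join(directory, filename)
        if os.path.isfile(candidate):
            with open(candidate, "rb") as f:
                return f.read()
    return None
```

The key property is that the ^-prefixed entries are consulted before any directory is touched, so a fully packed library search path never hits the file system at all.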

 

What do you think? Would it be worth implementing this?

20 thoughts on “Idea to bring back the old tpumover.exe”

  1. Lachlan Gemmell

    The compiler is fast enough for me, but I grow old waiting for Code Insight some days. If it helped with Code Insight, it would definitely be worthwhile.

  2. Mason Wheeler

    Agreeing with Lachlan here, on both points. If you can speed up *Insight with this, I’d definitely be interested…

  3. Rodrigo Farias Rezino

    The project that I work on is really big.
    Every time I need to compile it I want to cry.
    I think it could be a VERY important tool for us.

  4. Steven Kamradt

    Sounds to me like it would be worth implementing if it would decrease the amount of time waiting for code insight when working with non-trivial projects containing a large number of units.

  5. Jolyon Smith

    @Rodrigo:

    When you are compiling, only the changed units should be compiled.

    When you are building you would want to make sure that all (source) units are recompiled. The majority of your pre-compiled DCU libraries would need to be recompiled anyway in those cases.

    Any time you change a conditional define symbol or a compiler option or potentially even an IDE library path setting, your pre-compiled libraries would be out of date.

    I can’t quite believe people are talking about “slow hard disks” in this day and age. A far more likely contributor to poor compile times is poor project organisation in the first place.

    Excessively large units, unnecessary units “used” in the interface, units that use units which use the units that use them (possible if using from the implementation section) … these are great ways to bog the compiler down.

  6. Joe

    Insight is a pain in the neck, but compile times are not so bad. I don’t think it is worth the effort; I guess only a handful of people would use it, if any. I would prefer to see your precious time spent on other Delphi things like performance and bug fixing.

  7. Andreas Hausladen Post author

    @Jolyon Smith: “Any time you change a conditional define”: Do you recompile the RTL/VCL if you change a define in your project? I doubt that. And even if you have a copy of a RTL/VCL unit in your project directory the compiler will take your copy because it still works through the search paths, but if it hits a ^filename.dcl “path” it looks into the dcl file.

    “The majority of your pre-compiled DCU libraries would need to be recompiled anyway in those cases.”
    Wrong. I intend to “pack” the DCU files that do _not_ change every day. That’s why I gave the examples of bdslib.dcl and jedi.dcl. Those are the RTL/VCL and components, and they are relatively stable (unless you modify and update them all the time, but then you simply don’t pack them into a dcl).

    “I can’t quite believe people are talking about “slow hard disks””:
    Not everybody has an SSD. And even then the disk is still the slowest component in the computer (compared to RAM).

  8. Stéphane Wierzbicki

    Great idea, I really hope to test this new feature as soon as possible 🙂

  9. Charles Ardour

    Could it be possible to somehow use RAM above the 32-bit limit as a memory cache for these “dcl” files? E.g. a 64-bit version compiled with FPC interfacing with a 32-bit stub. I have a Win7 x64 development machine with 8GB RAM and I’m always in pain when I see that Delphi cannot use more memory for better caching… Until we someday see an x64 compiler and IDE compatible with Windows 8 or 9 ;-} …

    I have yet to find some nice RAM disk software which could solve some of the problems with file access in the Delphi IDE, so your idea is a nice second interesting solution.


    For the structure of the DCL files – maybe you can use some known file-based database engine (SQLite?) so the DCL files are relatively easy and user-friendly to create/update/view.

    ChAr

  10. Roland

    Hi Andi

    We would be very interested in that feature. Our project has more than 17000 code files and the compile time of the whole project is ~8 min. Code Insight is deactivated, because Delphi needs too much time to come back. We are waiting for your tool!

  11. Xepol

    This would be a HUGE boon to third party component development and distribution. HOWEVER, the amalgamated file should also contain any relevant .RES and .DFM files; otherwise you still have the problem of multiple files.

    It would, however, have to be used with caution. QR Reports, which ships with many versions of Delphi, has traditionally been a pain for those who upgraded to the full version. And when you want to replace the stock INDY components with a newer version, again you currently have to track down and remove the compiled units by hand. This has gotten easier as time goes on, but it does underscore the need to encourage people to use it responsibly AND to provide the ability to remove selected files from the .tpl later.

  12. Jolyon Smith

    @Andreas: No, the VCL doesn’t get recompiled (apart from a handful of units which I/we have been forced to modify to fix some bugs). But then I haven’t measured the time the VCL contributes to a compile (more accurately, a link) since it hasn’t been significant enough to be worth measuring.

    And SSD or no SSD, hard disks may be the slowest component in a computer, but that’s like saying a Lotus is the slowest car in a F1 race. It’s still damned quick, and much quicker than F1 cars of years past.

    I guess the problem is, I work in a team. I may not have changed a file but when I update from the repository I need to be sure that when I compile/build I am doing so with everybody’s changes.

    If I had DCU libraries they wouldn’t actually save me much time because they would simply add a NEW “must do” step to my procedures… after updating from the repository, rebuild all DCU libraries before rebuilding the project.

    A step which, if forgotten, potentially “fails” silently without my being aware that I forgot to do it.

    So if such a feature were to be provided, to make it useful (i.e. not dangerous) in such situations the compiler would (should) need to have some mechanism to automatically check at compile time that the DCU libs are up to date with their respective source units, at which point such checks would, I am guessing, reduce any “improvement” over not having DCU libs in the first place to virtually nothing.

    As with so many things, when there is something that used to be but which no longer is, we often forget the good reasons why it (perhaps) ceased to be and wish simply for the Good Old Days.

    Maybe this feature was outgrown when Turbo Pascal grew up and left the one-man hobby shed/bedroom behind ?

    Just asking.

  13. Andreas Hausladen Post author

    “need to have some mechanism to automatically check at compile time that the DCU libs are up to date”:
    The “reading” in “and reading from it when doing the next ‘make’” means more than just reading from the dcl file. The dcl file is a small database that not only contains the filename and content but also a time stamp. The dcl file has to provide everything that normal GetFileAttributesEx and FindFirstFile API calls can return. Otherwise I would have to fill the attributes and time stamps with random data. And I definitely wouldn’t do that.
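    A minimal sketch (in Python, with hypothetical field names; the real layout is undecided) of the per-entry metadata a *.dcl would have to carry so that GetFileAttributesEx/FindFirstFile-style queries and the “make” staleness check can be answered entirely from memory:

```python
from dataclasses import dataclass

@dataclass
class DclEntry:
    """One packed unit inside a *.dcl, mirroring what the file-system
    APIs (GetFileAttributesEx / FindFirstFile) would normally report."""
    name: str          # e.g. "sysutils.dcu"
    size: int          # file size in bytes
    write_time: int    # last-write timestamp (e.g. FILETIME as integer)
    attributes: int    # FILE_ATTRIBUTE_* flags
    data: bytes        # the compiled unit itself

def is_up_to_date(entry: DclEntry, source_write_time: int) -> bool:
    # The "make" check: the packed .dcu is stale if its source is newer.
    return entry.write_time >= source_write_time
```

    With real timestamps stored per entry, the compiler’s dependency check behaves exactly as it would against loose files on disk, just without touching the disk.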

    “when there is something that used to be but which no longer is, we often forget the good reasons why it (perhaps) ceased to be”:
    With Delphi 1 the RTL grew a lot and the VCL was added. Now try to load them all into memory in a 16-bit world.

    Could it be that you have all source directories in your search path, so the compiler compiles every single line of code that ends up in your EXE/BPL files? If so, this tool is definitely not for you. But if you have library units (rtl, vcl, 3rd party components) that the compiler can’t automatically recompile because it doesn’t have their source directories in its search path, this tool might help you to speed up the compiler’s I/O operations (which isn’t verified yet). And please don’t underestimate what the compiler does to your hard disk when it searches for units, especially if you have more than a handful of directories in your search path. (ProcessMonitor is your friend if you want to find that out.)
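    To make that I/O cost concrete, here is a back-of-envelope sketch under the assumption that the compiler probes each search-path directory for both a *.dcu and a *.pas candidate before giving up on a name; an indexed *.dcl would answer the same unresolved name with in-memory lookups instead:

```python
def probes_per_missing_unit(num_dirs, extensions=(".dcu", ".pas")):
    # Assumed model: every directory in the search path is probed once
    # per candidate extension before the compiler gives up on a name.
    return num_dirs * len(extensions)

# With 40 search-path directories, a single unresolved unit name costs
# 80 file-system probes, repeated for every unit of every compile.
```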

  14. Jolyon Smith

    @Andreas:

    “Could it be that you have all source directories in your search path, so the compiler compiles every single line of code that ends up in your EXE/BPL files? If so, this tool is definitely not for you”

    Not *every* source file is on the path, no. However, the vast (overwhelming) majority of source files are explicitly listed in the DPR (complete with location, using the “unitname in ‘relpath\unitname.pas’,” syntax).

    Could this explain why compile times are swifter than a swift thing going as swiftly as it can very swiftly and why insight remains tolerable? The question is, if that can work for me/us with a 2m LOC project, why do other people (seem to) have such trouble with much smaller projects ?

    And to answer your question, no, I haven’t peered under the skirts of the compiler with ProcessMonitor … why would I when I don’t have any issues with its performance?

  15. Mark

    Hi, I think the idea is excellent.
    We have all the 3rd party components in our version control (ClearCase). This puts them on a network share. The IDE and the compiler access the network thousands of times to look at the files. That may still be OK for small projects (a few seconds), but we also have larger projects (more than 1000 sources) which sometimes need over 5 minutes to compile.
    We have over 40 Delphi developers, and more than half of them complain about waiting times in the IDE (Code Insight) and the compiler.
    Copying the data to a local disk is no alternative, since each project has its own version of the various components.
    Monsters like DevExpress with more than 1000 files don’t make it any faster ;-(
    Greetings
    Mark

  16. Mason Wheeler

    @Jolyon: Wait, are you saying that you don’t automatically do a full build after checkouts anyway? At work, when I come in in the morning I check out, rebuild the client and middle tier servers, and rebuild component packages if anything from the checkouts involved the libraries. If Andy could make it so that building a package also produced the static library via a compiler hook of some sort, this wouldn’t really add a new step.

  17. Dorin Duminica

    In the end it’s your choice, Andy, but I would give it a “go” at least for testing purposes.
    All my Delphi IDEs are running from a VM (slower compile times, slower disk I/O, etc.) and I can see that something is fishy when Delphi reads from disk… I would allocate at least 1 GB of RAM for your tool.
    Can’t wait to give it a go!!

  18. Arioch

    1) In Turbo Pascal 5 tpumover was a TUI tool like Norton Commander (in modern terms: like 7-zip.org with two panels via the F9 hotkey).
    It was nicer than the command-line-only tpumover from TP6+ (and it also supported command-line operation, not only the TUI).

    DCU Mover might be a nice tool, but I also hope it will have a graphical UI.

    2) What about the VCLFix unit and Delphi XE? Is it no longer needed? Or will it be updated?

    Thank you.

Comments are closed.