[SOLVED] Prepending to a multi-gigabyte file


What would be the most performant way to prepend a single character to a multi-gigabyte file (in my practical case, a 40 GB file)?

There is no limitation on how this is implemented. It can be a tool, a shell script, a program in any programming language, …


There is no really simple solution: there are no system calls to prepend data to a file, only to append to it or to rewrite it.
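The rewrite approach is the straightforward baseline: build a new file from the prepended character plus the old contents, then swap it into place. A minimal sketch (file names such as `onecharfile.txt` and `bigfile` are just stand-ins, and the demo file here is tiny, not 40 GB):

```shell
# Demo setup: a small stand-in for the real multi-gigabyte file.
printf 'rest-of-file' > bigfile

# The rewrite approach: concatenate the single character and the old
# contents into a new file, then replace the original.  Simple, but it
# copies the whole file and needs that much free disk space again.
printf 'X' > onecharfile.txt
cat onecharfile.txt bigfile > bigfile.new
mv bigfile.new bigfile
```

For a 40 GB file this means reading and writing 40 GB once, plus temporarily holding two copies on disk.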

But depending on what you’re doing with the file, you may get away with tricks.
If the file is read sequentially, you could make a named pipe, run cat onecharfile.txt bigfile > namedpipe, and then use "namedpipe" as the file. The same can be achieved with cat onecharfile.txt bigfile | program if your program accepts stdin as input.
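The named-pipe trick can be sketched like this (again with tiny stand-in files; the real consumer would be whatever program reads the file sequentially):

```shell
# Demo stand-ins for the real files.
printf 'X' > onecharfile.txt
printf 'rest-of-file' > bigfile

# Create a named pipe and feed it the prepended stream in the background;
# a sequential reader can open "namedpipe" instead of bigfile and sees
# the extra character first, with no extra copy written to disk.
mkfifo namedpipe
cat onecharfile.txt bigfile > namedpipe &

# Stand-in consumer: in practice this would be your program.
cat namedpipe > result.txt
wait
rm namedpipe
```

Nothing is copied on disk here; the concatenation happens on the fly as the consumer reads, which is why this only works for sequential access.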

For random access, a FUSE filesystem could do it, but that is probably way too complicated for this.

If you want to get your hands really dirty, figure out how to

  • allocate a data block (read up on inode and data-block structure)
  • insert it into the file's chain as the second block (or the first, and then you're practically done)
  • write the beginning of the file into that block
  • write the single character as the first byte of the file
  • mark the first block as using only one byte of its available payload (this is possible for the last block; I don't know if it's possible for blocks in the middle of a file's chain).

This could majorly wreck your filesystem, though, so it's not recommended; but it would be good fun.
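A safer middle ground, if the extra 40 GB of free space for a full rewrite is the problem, is to shift the existing contents in place: grow the file by one byte, copy chunks toward the end starting from the tail (so nothing is overwritten before it has been read), then write the new character at offset 0. A hedged sketch using dd; each chunk is staged through a temporary file so the overlapping write cannot clobber unread data:

```shell
# Demo stand-in for the real multi-gigabyte file.
printf 'rest-of-file' > bigfile

size=$(wc -c < bigfile)
chunk=4096                        # use a much larger chunk for a 40 GB file
off=$size
while [ "$off" -gt 0 ]; do
  n=$chunk
  [ "$off" -lt "$n" ] && n=$off
  off=$((off - n))
  # Stage the chunk in a temp file first, then write it back one byte
  # further along; conv=notrunc keeps dd from truncating the target.
  dd if=bigfile of=chunk.tmp bs=1 skip="$off" count="$n" 2>/dev/null
  dd if=chunk.tmp of=bigfile bs=1 seek=$((off + 1)) conv=notrunc 2>/dev/null
done
rm -f chunk.tmp

# Finally, write the single character into the freed first byte.
printf 'X' | dd of=bigfile bs=1 conv=notrunc 2>/dev/null
```

This avoids needing a second copy's worth of free space, but it still rewrites every byte of the file, and an interruption mid-run leaves the file partially shifted, so it's not safer against crashes than the plain rewrite.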

Answered By – Pasi Savolainen

Answer Checked By – Marie Seifert (BugsFixing Admin)
