Peijen's Programming Thread 1.0
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
VLSmooth wrote: I'll see about trying that out tomorrow, but the extra bool requirement is definitely hackish. In the end, I'm fairly certain I won't be using that approach.
There. I changed it.
Last edited by Jonathan on Wed Oct 22, 2003 11:45 pm, edited 1 time in total.
-
- Tenth Dan Procrastinator
- Posts: 3055
- Joined: Fri Jul 18, 2003 3:02 am
- Location: Varies
- Contact:
In conclusion, I really should've taken the "make" training before doing any of this. A very nice make process and timing infrastructure already exists (make is a very convenient wrapper around gmake as well). Too bad I can't compile anything while a massive CVS conversion is taking place. Fun fun.
Last edited by VLSmooth on Tue Nov 18, 2003 10:22 pm, edited 1 time in total.
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
I have a long running process currently taking up 100% CPU.
What happens if I copy over an object file that the binary for my process is dynamically linked against while the process is still running?
There are three cases:
Functionally equivalent object file
Slightly different object file (bugfix, doesn't affect functionality)
Completely different object file
My current thinking is that all three cases are the same, because the object file has already been loaded into memory, so it doesn't matter what happens on disk. However, I could see that perhaps only part of the object file got loaded into memory (the portions currently in use), so that if the running process had to go back to disk for other parts, it would find a completely different object file and crash.
I'm also not sure what happens in the slightly-different case if not all of the object file is in memory. Even if the changes don't affect the process's functionality but, say, put the functions at slightly different offsets in the file, that may break the process too. Or maybe all the linking is done by symbol name, and the absolute positions don't matter as long as the names stay the same.
Anybody with more systems experience than me care to comment? This is on a Linux 2.4 platform.
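For what it's worth, the distinction that usually matters on Linux is how the new file lands on disk, not how different its contents are. The dynamic linker mmap()s shared objects and pages them in on demand, so overwriting the file in place can feed the running process inconsistent pages, while replacing it via rename() leaves the old inode, and therefore the old mapping, intact. A minimal sketch of the rename behavior, using an open file descriptor as a stand-in for the process's mapping (libfoo.so and the temp directory are hypothetical names):

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d)
printf 'old library code' > "$tmp/libfoo.so"

# Stand-in for the running process: hold the "library" open.
exec 3< "$tmp/libfoo.so"

# Ship the fix as a new file, then rename it over the old one.
# rename() swaps the directory entry; the old inode survives for
# anyone who already has it open or mapped.
printf 'new library code' > "$tmp/libfoo.so.new"
mv "$tmp/libfoo.so.new" "$tmp/libfoo.so"

on_disk=$(cat "$tmp/libfoo.so")   # new processes pick up the fix
still_open=$(cat <&3)             # the old process keeps its original copy
echo "$on_disk"
echo "$still_open"
```

A plain cp, by contrast, writes through to the same inode, which is exactly the case where a partially paged-in object can bite you.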
-
- Minion to the Exalted Pooh-Bah
- Posts: 2790
- Joined: Fri Jul 18, 2003 2:28 pm
- Location: Irvine, CA
I don't really have an answer to your question. But from what I have been doing over the last few months, I've found myself tending to load all the data into memory when the program first starts, to cut down on disk accesses and to keep the code that loads data from disk in one place.
So I guess what I am saying is that it depends on how the program is written. If all the libraries are loaded as soon as the process starts, then you shouldn't have too much of a problem. However, if the libraries are loaded as needed, then there is no telling what will happen: a library may already be loaded, be in the middle of loading, or not be loaded at all.
I have no idea what the Linux kernel does, and I have no idea what your process/compiler do either.
Couldn't you just version the libraries? I think Linux lets you link different programs against different versions of a library, all running at once, but I could be talking out of my ass.
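The versioning idea is roughly how shared libraries are deployed on Linux: each build gets its own versioned file, and a symlink is retargeted atomically to point at the current one. Processes that already resolved the old path keep running against the old inode; new processes follow the updated symlink. A sketch with hypothetical names (libfoo.so.1.0, libfoo.so.1.1):

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d)

printf 'v1.0' > "$tmp/libfoo.so.1.0"
ln -s libfoo.so.1.0 "$tmp/libfoo.so.1"    # what everything links against

# Ship the bugfix as a new versioned file, then retarget the symlink
# atomically: create the new link under a temp name and rename it over.
printf 'v1.1' > "$tmp/libfoo.so.1.1"
ln -s libfoo.so.1.1 "$tmp/libfoo.so.1.new"
mv "$tmp/libfoo.so.1.new" "$tmp/libfoo.so.1"

target=$(readlink "$tmp/libfoo.so.1")     # now points at the fixed version
current=$(cat "$tmp/libfoo.so.1")
echo "$target"
echo "$current"
```

The old versioned file stays on disk untouched, so the processes you deliberately left running never see the swap.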
-
- Grand Pooh-Bah
- Posts: 6722
- Joined: Tue Sep 19, 2006 8:45 pm
- Location: Portland, OR
- Contact:
I was simplifying the issue. What is happening in reality is this: I have 338 processes running across as many machines. I realized there is a bug in one of my object files that affects some of those processes and has to be corrected. I killed those processes, since their results were junk anyway. I am left with 222 processes that will produce good data. I want to keep whatever intermediate results they've calculated, so I don't want to kill them. However, I need to start new processes, using the fixed object file, to redo the calculations I junked. I was trying to figure out whether I could just copy over the bad object file and restart my processes.
The way I solved this was by letting the currently running processes continue to run with the bad object file. I then redid everything else with the good object file saved in a different location. This way the two sets of runs won't stomp on each other. I still don't know if I could have copied over the object file, but I guess it doesn't matter at this time.
Man, I wish I had done this check on Sunday like I meant to. I had to delete ~300 results for a one-line fix.