Valhalla Legends Forums Archive | General Discussion | Why Windows crashes, and Linux doesn't

Mephisto
My friend and I came up with an idea that perhaps the reason Windows crashes more than Linux is that it runs off one file, Explorer.exe. So when Explorer gets mad and has some complications, your whole computer does too, and it results in a crash. Because Linux runs off of several files, when one gets complications, your whole system doesn't end up crashing the way Windows would. Also, this could be why Linux gets better uptimes: it doesn't rely on just one file like Windows does. Windows may just go off of one file since they want to keep their source closed, and managing one file is easier than managing multiple. Maybe Linux runs faster because it has multiple files handling data and processing it. My friend and I don't really have a lot of evidence backing up this theory, and we're not 100% sure Linux even does run off of multiple files, but if it's true, perhaps this is why? Dunno... what do you guys think?
December 15, 2003, 3:02 AM
Skywing
I think this is a good candidate for the Fun Forum.
December 15, 2003, 3:09 AM
Grok
[quote author=Mephisto link=board=2;threadid=4245;start=0#msg35402 date=1071457364]
My friend and I came up with an idea that perhaps the reason Windows crashes more than Linux is that it runs off one file, Explorer.exe. So when Explorer gets mad and has some complications, your whole computer does too, and it results in a crash. Because Linux runs off of several files, when one gets complications, your whole system doesn't end up crashing the way Windows would. Also, this could be why Linux gets better uptimes: it doesn't rely on just one file like Windows does. Windows may just go off of one file since they want to keep their source closed, and managing one file is easier than managing multiple. Maybe Linux runs faster because it has multiple files handling data and processing it. My friend and I don't really have a lot of evidence backing up this theory, and we're not 100% sure Linux even does run off of multiple files, but if it's true, perhaps this is why? Dunno... what do you guys think?
[/quote]

Am quoting it so he can't edit it away. My reply:

O.M.G.
December 15, 2003, 3:44 AM
Arta
:D
December 15, 2003, 4:14 AM
Stealth
Have a look at your C:\Windows\System\ or \System32\ folder, then come back here and tell us that Windows runs from a single file.
December 15, 2003, 5:10 AM
Kp
... that alone would have no effect on stability. In both cases, the OS has to operate in privileged mode, which among other things means it can pretty seriously ruin its own data structures and there's nobody there to stop it. Stability has nothing to do with the number of unique files running in privileged mode and a great deal to do with whether, and to what extent, you use a "good" code base: one which has been thoroughly debugged, does not fail under stress, etc. As another point, whether something is open source seems to have an effect on whether the code is good, but it is by no means a deciding factor.
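
To illustrate that point, here's a minimal user-mode sketch (a hypothetical example, not anything from the thread): a child process that writes through a bad pointer gets killed by the kernel and everything else keeps running, whereas kernel code making the same mistake has no supervisor above it to contain the damage.

[code]
/* User-mode faults are contained by the kernel; a comparable fault
 * inside privileged kernel code has nobody there to catch it. */
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t pid = fork();
    if (pid == 0) {
        /* Child: scribble through a NULL pointer. In user mode the MMU
         * traps this and the kernel kills only this process. */
        volatile int *p = NULL;
        *p = 42;          /* SIGSEGV here */
        _exit(0);         /* never reached */
    }

    int status = 0;
    waitpid(pid, &status, 0);
    if (WIFSIGNALED(status))
        printf("child died with signal %d; parent and OS keep running\n",
               WTERMSIG(status));
    /* Kernel code writing through a bad pointer corrupts kernel memory
     * directly; there is no outer layer to contain the damage. */
    return 0;
}
[/code]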
December 15, 2003, 5:40 AM
Denial
Wow, we need a kiddy section! This post is no place for the fun forum.
December 15, 2003, 6:00 AM
Adron
Well, I wouldn't say the basic idea is so wrong. His statement is incorrect because Linux has a much more monolithic kernel than Windows; if anything, Linux can run off fewer files when you compile everything into your kernel.

Other than that, if an OS is made up of independent parts that don't crash the kernel (MS seems to make more things critical than Linux does), then fewer of its failures will bring it down.

Then on the other hand, having fewer and smaller files makes an OS less likely to have bugs, with the same number of people working on it...
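
For reference, the "compile it in or load it separately" choice looks roughly like this. Below is a minimal, hypothetical Linux "hello" module sketch: the same source can be linked into the kernel image (the CONFIG_...=y case, fewer files) or built as a separate .ko and loaded at run time (=m), and either way it executes in privileged mode.

[code]
/* Minimal Linux kernel module sketch (hypothetical "hello" module). */
#include <linux/init.h>
#include <linux/kernel.h>
#include <linux/module.h>

MODULE_LICENSE("GPL");

static int __init hello_init(void)
{
    printk(KERN_INFO "hello: loaded\n");
    return 0;
}

static void __exit hello_exit(void)
{
    printk(KERN_INFO "hello: unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);
[/code]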
December 16, 2003, 8:30 AM
UserLoser.
Mephisto = Mrs.Zora = Xrph
December 17, 2003, 5:08 PM
Grok
[quote author=Adron link=board=2;threadid=4245;start=0#msg35683 date=1071563410]
Well, I wouldn't say the basic idea is so wrong. His statement is incorrect because Linux has a much more monolithic kernel than Windows; if anything, Linux can run off fewer files when you compile everything into your kernel.

Other than that, if an OS is made up of independent parts that don't crash the kernel (MS seems to make more things critical than Linux does), then fewer of its failures will bring it down.

Then on the other hand, having fewer and smaller files makes an OS less likely to have bugs, with the same number of people working on it...

[/quote]

Not exactly a complete statement. Studies by IBM have shown many different reasons for bugs being introduced into software. Having fewer files does reduce the number of bugs, but having smaller files does not. The evidence suggests that, with respect to procedure size, large procedures with hundreds of lines have fewer errors than many small procedures with dozens of lines. This is counterintuitive to how we think and program, but it is supported by their studies. I prefer to write small functions for discrete manipulations and build larger blocks from those. For me this works well, and I feel I'm doing a better job. Perhaps for programmers in general, writing large procedures lets them keep more focus on the algorithm and more details in front of them. If so, that just suggests they're sloppy function writers, not good, focused programmers.
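
A tiny, made-up C example of the "small functions for discrete manipulations, built into larger blocks" style described above (the function names are invented purely for illustration):

[code]
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Each helper does one discrete manipulation. */
static void trim_newline(char *s)
{
    size_t n = strlen(s);
    if (n > 0 && s[n - 1] == '\n')
        s[n - 1] = '\0';
}

static void to_lower(char *s)
{
    for (; *s; ++s)
        *s = (char)tolower((unsigned char)*s);
}

/* The larger block is built from the small pieces, so the overall
 * algorithm stays readable even though the details live elsewhere. */
static void normalize(char *s)
{
    trim_newline(s);
    to_lower(s);
}

int main(void)
{
    char line[] = "HELLO World\n";
    normalize(line);
    printf("%s\n", line);   /* prints "hello world" */
    return 0;
}
[/code]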

Edit: source material from "Code Complete"
December 17, 2003, 5:31 PM
St0rm.iD
Is that really xrph?
December 18, 2003, 12:39 AM
Null
LIES LIES ALL LIES

you don't have a friend ;D
December 23, 2003, 2:55 AM
Adron
Since this post was reawakened...

[quote author=Grok link=board=2;threadid=4245;start=0#msg36004 date=1071682301]
Having fewer files does reduce the number of bugs, but having smaller files does not. The evidence suggests that, with respect to procedure size, large procedures with hundreds of lines have fewer errors than many small procedures with dozens of lines.
[/quote]

Are these the sizes they refer to? Small is a few dozen lines and large is many hundreds? I was thinking more about file sizes than procedure sizes: having 10-50k versus 400-1000k source files. For small source files you can have either a large number of small procedures or a few big ones. For large source files, you can have either a huge number of small procedures or a large number of big ones.

If the project is divided into reusable pieces by source file, keeping the pieces limited and easily testable should reduce bugs. At least that's my experience from the huge nbbot.cpp... :P
December 24, 2003, 2:02 AM
MrRaza
How many lines *does* nbbot.cpp have?
December 24, 2003, 2:26 AM
iago
Adron's absolutely right... I was looking through piles and piles of .jsp and .java files yesterday, and the fact that there were tons of different files, each doing a different thing, saved a lot of trouble.
December 24, 2003, 3:37 AM
Grok
Well since Adron's absolutely right, there's no need to reply.
December 24, 2003, 4:17 AM
iago
[quote author=Grok link=board=2;threadid=4245;start=15#msg36954 date=1072239468]
Well since Adron's absolutely right, there's no need to reply.
[/quote]

Yet you still felt the need to reply. Odd!
December 24, 2003, 4:31 AM
Grok
[quote author=iago link=board=2;threadid=4245;start=15#msg36956 date=1072240281]
[quote author=Grok link=board=2;threadid=4245;start=15#msg36954 date=1072239468]
Well since Adron's absolutely right, there's no need to reply.
[/quote]

Yet you still felt the need to reply. Odd!
[/quote]


.... to the issue.
December 24, 2003, 4:36 AM
Forged
[quote author=Mephisto link=board=2;threadid=4245;start=0#msg35402 date=1071457364]
My friend and I came up with an idea that perhaps the reason Windows crashes more than Linux is that it runs off one file, Explorer.exe. So when Explorer gets mad and has some complications, your whole computer does too, and it results in a crash. Because Linux runs off of several files, when one gets complications, your whole system doesn't end up crashing the way Windows would. Also, this could be why Linux gets better uptimes: it doesn't rely on just one file like Windows does. Windows may just go off of one file since they want to keep their source closed, and managing one file is easier than managing multiple. Maybe Linux runs faster because it has multiple files handling data and processing it. My friend and I don't really have a lot of evidence backing up this theory, and we're not 100% sure Linux even does run off of multiple files, but if it's true, perhaps this is why? Dunno... what do you guys think?
[/quote]

Linux mainly runs off one kernell. Microsoft runs off of hundreds of .dll files. If one of the many needed Windows files becomes corrupt or experiences an error, it crashes. It requires a major fuck-up to crash the single kernell. Kinda had it backwards on both counts.
December 24, 2003, 5:00 AM
iago
kernel*

Linux also has .dll equivalents!
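
For the curious, Linux's .dll equivalent is the shared object (.so), and it can be loaded at run time much like LoadLibrary/GetProcAddress on Windows. A minimal sketch; libm and its cos symbol are used only because they're convenient real examples (build with -ldl):

[code]
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    /* Open a shared object at run time, the way Windows code would
     * call LoadLibrary on a .dll. */
    void *handle = dlopen("libm.so.6", RTLD_LAZY);
    if (!handle) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* Look up a symbol by name, like GetProcAddress. */
    double (*cosine)(double) = (double (*)(double))dlsym(handle, "cos");
    if (cosine)
        printf("cos(0.0) = %f\n", cosine(0.0));

    dlclose(handle);
    return 0;
}
[/code]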
December 24, 2003, 12:36 PM
St0rm.iD
[quote author=Forged link=board=2;threadid=4245;start=15#msg36964 date=1072242050]
[quote author=Mephisto link=board=2;threadid=4245;start=0#msg35402 date=1071457364]
My friend and I had come up with an idea that perhaps the reason why Windows crashes more than Linux is that it runs off one file, Explorer.exe. So when Explorer gets mad and has some complications, your whole computer does and results in a crash. Because Linux runs off of several files, when one gets complications, your whole program doesn't end up crashing as Windows would. Also, this could be why Linux gets better uptimes because it doesn't rely on just one file like Windows does. Windows may just go off of one file since they want to keep their source closed, and managing one file is just easier than multiple. Maybe Linux runs faster because it has multiple files handling data and processing it. My friend and I don't really have a lot of evidence backing up this theory, and we're not 100% sure if Linux even does run off of multiple files, but if it's true, perhaps this why? Dunno... what do you guys think?
[/quote]

Linux mainly runs off one kernell. Microsoft runs off of hundreds of .dll files. If one of the many needed Windows files becomes corrupt or experiences an error, it crashes. It requires a major fuck-up to crash the single kernell. Kinda had it backwards on both counts.
[/quote]

Ugh, no.
December 24, 2003, 4:16 PM
Raven
[quote author=Forged link=board=2;threadid=4245;start=15#msg36964 date=1072242050]

If one of the many needed Windows files becomes corrupt or experiences an error, it crashes.
[/quote]

Not true. Only if one of the crucial files (there aren't THAT many of them) is corrupt will it malfunction, and even then you'll often just be told that File %s is corrupt, go fix it plz. The same goes for many other DLL files: the OS will function normally until the specific file is invoked, at which point, once again, it'll tell you it's malfunctioning and that you should go fix it.

As a bit of an add-on to a previous statement, Explorer.exe is NOT a necessary file. You can end-task it and Windows will still run normally, except that you will mostly lose the GUI desktop. However, you can still manually start processes. I sometimes do that if I want a particular operation running as smoothly as possible, because Explorer is a resource hog. ;)
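
For what it's worth, starting a process without Explorer around comes down to a direct Win32 call. A minimal sketch of one way to do it; notepad.exe is just an arbitrary example program:

[code]
#include <windows.h>
#include <stdio.h>

int main(void)
{
    STARTUPINFOA si = { sizeof(si) };
    PROCESS_INFORMATION pi = { 0 };
    char cmdline[] = "notepad.exe";

    /* Launch the process directly; no shell or desktop required. */
    if (!CreateProcessA(NULL, cmdline, NULL, NULL, FALSE,
                        0, NULL, NULL, &si, &pi)) {
        printf("CreateProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* The new process runs on its own; just close our handles to it. */
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
    return 0;
}
[/code]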
December 24, 2003, 4:23 PM
St0rm.iD
Let's either let this drop, or make it funny. Don't answer intelligently, please.
December 24, 2003, 7:53 PM
Newby
"Don't answer intelligently, please."

i read in a linux form that all inwdows r bakcdroooerd the admin said so and somebody said no and he banned him therefore they really r bkcdoored omfg i </3 microsoft now

How's that?
December 25, 2003, 3:18 PM
St0rm.iD
Perfect.
December 26, 2003, 8:34 PM
