Valhalla Legends Forums Archive | General Programming | 32-bit programming coming to an end?

DVX
I've been hearing a lot lately about 64-bit processors, Microsoft OSes for 64-bit processors, etc.

This got me thinking: is 32-bit programming coming to an end, with 64-bit programming coming next very soon?

Also, with Longhorn and Microsoft abandoning the Win32 API (so I have heard), is this sort of an attempt to get us to move towards 64-bit programming? (While Microsoft is abandoning the Win32 API, they are making APIs for .NET now.)

This is just what I have heard; none of it is really factual. Any comments?
February 3, 2004, 7:25 PM
Adron
You can still run 16-bit programs in Windows. It's not likely that 32-bit programming will be abandoned any time soon, though there might eventually be a reduction in API support and development for it.
February 3, 2004, 9:50 PM
j0k3r
Seeing as this is in General, can someone expand on what this means? Are 64-bit processors/programs just faster than 32-bit processors/programs, or is there a whole new layer to be revealed?
February 3, 2004, 10:33 PM
Adron
They have 64-bit general purpose registers and 64-bit addressing modes.

What this mostly means is that they can handle more memory. If you're going to put more than 2-4 GB of RAM in your computer, you might want to look at a 64-bit processor.

Most operations won't automatically be much faster. Today's processors already have memory buses wider than 32 bits, and it's not often that you really need 64-bit variables instead of 32-bit ones. Using bigger variables than needed just eats more memory bandwidth...
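
Roughly, in C terms (a minimal sketch; the exact sizes depend on the compiler, assuming a typical 32-bit vs. 64-bit build):

[code]
/* Minimal sketch: what "64-bit" buys you in ordinary C code. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Pointers follow the address width: 4 bytes on a 32-bit build
       (so at most 2^32 = 4 GB addressable), 8 bytes on a 64-bit build. */
    printf("pointer size: %u bytes\n", (unsigned)sizeof(void *));

    /* 64-bit arithmetic works on a 32-bit CPU too, but the compiler
       must emit several 32-bit instructions per 64-bit operation;
       a 64-bit CPU does it in one. */
    uint64_t big = 5000000000ULL;   /* doesn't fit in 32 bits */
    printf("big = %llu\n", (unsigned long long)big);

    /* And bigger-than-needed variables cost memory bandwidth: */
    printf("1M uint32_t: %u bytes; 1M uint64_t: %u bytes\n",
           (unsigned)(1000000 * sizeof(uint32_t)),
           (unsigned)(1000000 * sizeof(uint64_t)));
    return 0;
}
[/code]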
February 3, 2004, 11:21 PM
Myndfyr
Yes... as I recall, a 64-bit processor means that all of the internal and external architecture is 64-bit: the ALU(s), the address registers, the memory bus itself, etc. This means that the processor can address more memory overall (32-bit processors can address up to 4 GB natively; a 64-bit processor can address up to 2^64 bytes, about 18.4 quintillion, or 16 exabytes). It also means that there is native support for arithmetic on 64-bit data types, and that the memory bus can transfer more data per block movement operation than a 32-bit processor can.
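
As a sanity check on those numbers, a back-of-the-envelope sketch in C (nothing platform-specific, just the arithmetic):

[code]
/* Back-of-the-envelope address-space limits. */
#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("2^32 = %llu bytes (4 GB)\n", 1ULL << 32);
    /* 2^64 itself won't fit in a 64-bit integer; ULLONG_MAX is 2^64 - 1. */
    printf("2^64 - 1 = %llu bytes (about 16 exabytes)\n", ULLONG_MAX);
    return 0;
}
[/code]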

That said, unless Microsoft really shapes up its support for consumer use of 64-bit technology, I don't foresee 64-bit as a platform coming into "common usage" until at least 2008, my prediction for when the next edition of Winblows (after Longhorn) ships. If you look at Windows XP 64-Bit Edition, nothing is supported: no Internet Connection Sharing, no network bridge, no Windows Media, the list goes on.

For now, and I'm guessing it will remain the same through Longhorn, there isn't enough consumer need for a 64-bit PC. The technology will stay in the realm of scientists and animators who already have specialized software to run on the platform, and who will see a true performance boost out of it.
February 6, 2004, 5:20 PM
Adron
An important thing to note today is that the memory bus size of a 32-bit processor can be much greater than 32 bits. So in most cases you won't automatically get higher memory bandwidth with a 64-bit cpu.
February 6, 2004, 6:18 PM
Myndfyr
[quote author=Adron link=board=5;threadid=5064;start=0#msg42805 date=1076091532]
An important thing to note today is that the memory bus size of a 32-bit processor can be much greater than 32 bits. So in most cases you won't automatically get higher memory bandwidth with a 64-bit cpu.
[/quote]

Well no, but at any given time a 32-bit processor can only address up to 4 GB of memory. If you're using some kind of hardware paging, I suppose you could address beyond that. The address lines are what I was referring to, not the actual data bus.
February 6, 2004, 6:26 PM
Adron
[quote author=Myndfyre link=board=5;threadid=5064;start=0#msg42808 date=1076091962]
[quote author=Adron link=board=5;threadid=5064;start=0#msg42805 date=1076091532]
An important thing to note today is that the memory bus size of a 32-bit processor can be much greater than 32 bits. So in most cases you won't automatically get higher memory bandwidth with a 64-bit cpu.
[/quote]

Well no, but at any given time a 32-bit processor can only address up to 4 GB of memory. If you're using some kind of hardware paging, I suppose you could address beyond that. The address lines are what I was referring to, not the actual data bus.
[/quote]

Oh, ok. I interpreted this

[quote]
and that the memory bus can transfer more data per block movement operation than a 32-bit processor can.
[/quote]

to mean that the data lines of the memory bus of a 64-bit processor would be wider than those of a 32-bit processor, which isn't necessarily true.

There is also the relatively common "PAE" hardware paging, which supports addressing more than 4 GB of memory on a 32-bit CPU (36-bit physical addresses, for up to 64 GB).
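
On Windows that kind of paging is exposed through AWE (Address Windowing Extensions). A rough sketch of the idea (error handling and the required SeLockMemoryPrivilege are omitted, so treat this as an outline rather than working code):

[code]
/* Rough AWE outline: a 32-bit process reaching physical memory that can
   live above the 4 GB line. All error checking omitted. */
#include <windows.h>

void awe_demo(void)
{
    SYSTEM_INFO si;
    GetSystemInfo(&si);

    /* Ask the OS for 64 MB worth of physical pages... */
    ULONG_PTR pages = (64 * 1024 * 1024) / si.dwPageSize;
    ULONG_PTR *pfns = (ULONG_PTR *)HeapAlloc(GetProcessHeap(), 0,
                                             pages * sizeof(ULONG_PTR));
    AllocateUserPhysicalPages(GetCurrentProcess(), &pages, pfns);

    /* ...reserve an ordinary 32-bit virtual "window"... */
    void *window = VirtualAlloc(NULL, pages * si.dwPageSize,
                                MEM_RESERVE | MEM_PHYSICAL, PAGE_READWRITE);

    /* ...and map whichever physical pages are needed right now into it.
       Remapping the window over and over is how a 32-bit app can use
       far more RAM than it can address at once. */
    MapUserPhysicalPages(window, pages, pfns);
}
[/code]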
February 6, 2004, 7:33 PM
MesiaH
Now AMD goes around saying one constant thing, "run your 32-bit applications without any loss in performance," blah de dah, but does that really make a difference? Because if it does, then why would AMD be developing 64-bit processors from the ground up for Microsoft and their new 64-bit applications/operating systems, if Microsoft was just going to abandon the 32-bit API?

But what I'm thinking is that the 64-bit processor would automatically handle 32-bit applications; they just say that to make their stuff look good and secure..
February 7, 2004, 8:19 AM
tA-Kane
Being a 64-bit processor would mean that the instructions are 64 bits in length instead of 32. If a 64-bit processor expects that, it may require conversion of 32-bit instructions into native 64-bit instructions, which would result in a slight slowdown.

If this is not the case (and even if it is), how would 64-bit processors run 32-bit instructions without slowdown?
February 7, 2004, 10:22 PM
Myndfyr
[quote author=tA-Kane link=board=5;threadid=5064;start=0#msg42998 date=1076192575]
Being a 64-bit processor would mean that the instructions are 64 bits in length instead of 32. If a 64-bit processor expects that, it may require conversion of 32-bit instructions into native 64-bit instructions, which would result in a slight slowdown.

If this is not the case (and even if it is), how would 64-bit processors run 32-bit instructions without slowdown?
[/quote]

You might recall that the 8086 was touted as the world's first 20-bit processor. It wasn't really 20-bit; it only had 20 address lines. Internally it was a 16-bit processor.

In any case, the 16-bit processor instructions are all still provided in the 32-bit x86 chips. IIRC, NOP is 90h (I hated assembler and never want to see it ever again), and it will still be 90h on the 64-bit chips. There isn't a special "64-bit instruction set" -- any new instructions they add may be encoded with more bytes, but even new instructions don't have to be longer than 32 bits.

But when you're running a 32-bit program on a 64-bit machine, you won't be using the full potential of the system bus, the data bus, or the internal architecture, because the compiled program was designed for different, smaller data types. That's why it's advantageous to recompile a 16-bit application for a 32-bit environment; you'll see performance gains, especially in areas like block transfers.
February 8, 2004, 9:02 AM
Adron
[quote author=tA-Kane link=board=5;threadid=5064;start=0#msg42998 date=1076192575]
Being a 64-bit processor would mean that the instructions are 64 bits in length instead of 32. If a 64-bit processor expects that, it may require conversion of 32-bit instructions into native 64-bit instructions, which would result in a slight slowdown.
[/quote]

The instruction lengths probably vary. x86 instructions have very different lengths; they're not all 32 bits long.

Some 64-bit CPUs seem to use an x86-like instruction/register set (AMD's?). In that case, the switch can be based on a mechanism like the switch from 16- to 32-bit: setting the processor to a default mode and using prefix bytes to make exceptions.
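
That prefix mechanism is easy to see in the raw bytes. A hand-assembled illustration (per the AMD64 encoding; shown as C data only for readability):

[code]
/* The same logical instruction hand-assembled for 32-bit and 64-bit
   operand sizes on AMD's x86-64. Encodings are variable-length; the
   REX.W prefix byte (0x48) is what selects the 64-bit operand size. */
unsigned char mov_eax_1[] = { 0xB8, 0x01, 0x00, 0x00, 0x00 };
    /* mov eax, 1  -- 5 bytes; valid in both 32- and 64-bit mode */
unsigned char mov_rax_1[] = { 0x48, 0xC7, 0xC0, 0x01, 0x00, 0x00, 0x00 };
    /* mov rax, 1  -- 7 bytes; the REX.W prefix plus the familiar C7 /0 form */
[/code]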

Some 64-bit CPUs seem to use a completely different instruction/register set (Intel's?). In that case you can't readily mix 32-bit and 64-bit instructions in the same application, and I suppose that yes, a slowdown for executing 32-bit instructions could be expected.
February 8, 2004, 11:24 AM
Puzzle
I don't think Intel has released its 64-bit processor yet, has it?

I know Longhorn is being tested on the AMD processor since Intel hasn't finished theirs, or at least hadn't the last time I heard.
March 7, 2004, 1:55 AM
Grok
[quote author=Puzzle link=board=5;threadid=5064;start=0#msg47999 date=1078624529]
I don't think Intel has released its 64-bit processor yet, has it?

I know Longhorn is being tested on the AMD processor since Intel hasn't finished theirs, or at least hadn't the last time I heard.
[/quote]

I believe Merced was the first IA-64 processor.

64-bit extension technology is one of a number of innovations being added to Intel's IA-32 Server/Workstation platforms in 2004. It represents a natural addition to Intel's IA-32 architecture, allowing platforms to access larger amounts of memory. Processors with 64-bit extension technology will support 64-bit extended operating systems from Microsoft, Red Hat and SuSE. Processors running in legacy mode remain fully compatible with today's existing 32-bit applications and operating systems.

FAQ: http://www.intel.com/technology/64bitextensions/faq.htm
March 7, 2004, 2:45 AM
R.a.B.B.i.T
cipher:
A cryptographic system in which units of plain text of regular length, usually letters, are arbitrarily transposed or substituted according to a predetermined code.

encryption:
1. To put into code or cipher. (Hrm....)
2. Computer Science. To alter (a file, for example) using a secret code so as to be unintelligible to unauthorized parties.


Hence: telling somebody how to decrypt something automatically defies the definition of encryption, and the file is no longer encrypted, just hard to read.
March 13, 2004, 12:57 AM
o.OV
[quote author=R.a.B.B.i.T link=board=5;threadid=5064;start=0#msg49132 date=1079139478]
cipher:
A cryptographic system in which units of plain text of regular length, usually letters, are arbitrarily transposed or substituted according to a predetermined code.

encryption:
1. To put into code or cipher. (Hrm....)
2. Computer Science. To alter (a file, for example) using a secret code so as to be unintelligible to unauthorized parties.


Hence: telling somebody how to decrypt something automatically defies the definition of encryption, and the file is no longer encrypted, just hard to read.
[/quote]

o_o where did that come from
March 13, 2004, 1:48 AM
Grok
I disagree, and so do the world's cryptanalysts.

Telling someone how to decrypt a message, and them still being unable to do so, merely helps prove that the encryption algorithm is a good one. In the cryptanalyst world, nobody would rely on an algorithm that the rest of the world had not tried to crack. Only the secret key is withheld.
March 13, 2004, 4:17 AM
j0k3r
[quote author=Grok link=board=5;threadid=5064;start=15#msg49170 date=1079151426]
I disagree, and so do the world's cryptanalysts.

Telling someone how to decrypt a message, and them still being unable to do so, merely helps prove that the encryption algorithm is a good one. In the cryptanalyst world, nobody would rely on an algorithm that the rest of the world had not tried to crack. Only the secret key is withheld.
[/quote]
Person 1: "How do I crack this?"
Person 2: "Here I'll tell you"
Person 2 tells Person 1
Person 1: "Ok what's the secret key? I thought you were going to tell me how to crack it."
Person 2: "Oh, well I can't tell you that"

You're not telling them anything without the key, so claiming you told them how to break it is a lie. You can't tell someone to open your lock, and then when they ask how, tell them "Well, two turns to the right until a certain number, one to the left, then land on the number, and then to the right until another number," and claim you told them how to open it.
March 13, 2004, 4:41 AM
Grok
How to decrypt and being able to use that knowledge are distinct notions in cryptanalysis.

I would not feel safe behind an encryption algorithm just because you did not know the algorithm. I would much rather you know the algorithm (how to decrypt a message), but still not be able to because you did not have the private key.

What you said, "you're not telling them anything without the key," is accurate, but the second part, "so claiming you told them how to break it is a lie," is still wrong.

"Security through obscurity is no security at all" is the golden tenet of security.
March 13, 2004, 10:14 AM
j0k3r
You can't break it without the key, so they are not able to break it. Have you really told them how to crack it? No; really telling them how would include the key, so that they'd actually be able to do it. Anyway, I think it's just a technicality in the saying.

What's the point of using an algorithm at all if the only security is the secret key, or if you're going to release the algorithm?
March 13, 2004, 3:56 PM
Kp
[quote author=j0k3r link=board=5;threadid=5064;start=15#msg49205 date=1079193365]What's the point of using an algorithm at all if the only security is the secret key, or if you're going to release the algorithm?[/quote]

Well, most of the popular algorithms are publicly known, and it has been independently confirmed many times over that you can't break them without the secret key. Having the algorithm out there lets others confirm for you that it really will hold up against a "modest" brute-force attack (e.g. someone tries a few thousand combinations for the key and gives up). An algorithm which relies upon no one knowing what it did to the text is broken as soon as a client capable of decrypting it gets out. As long as no one has the client, they have no idea what it's done to the code (unless they do exhaustive analysis of ciphertexts and associated cleartexts, but that could be very difficult if the algorithm is even moderately complex). However, as soon as you get the client and see that it just takes the <math operation> of the ciphertext to get the cleartext, you can immediately break down all messages sent with that algorithm. If a secret key was fed into the operation as well, you'd need to acquire the ciphertext, the algorithm, and the secret key. Secret keys are, well, secret, so that's where the security comes from.
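
As a toy illustration of that split between public algorithm and secret key (my sketch, and deliberately weak; a real system would use a vetted cipher like 3DES):

[code]
/* Toy keyed "cipher": the algorithm is public, only the key is secret.
   Repeating-key XOR like this is NOT secure; it's just here to show
   where the secret actually lives. */
#include <stdio.h>
#include <string.h>

static void xor_crypt(unsigned char *buf, size_t len,
                      const unsigned char *key, size_t keylen)
{
    for (size_t i = 0; i < len; i++)
        buf[i] ^= key[i % keylen];   /* same routine encrypts and decrypts */
}

int main(void)
{
    unsigned char msg[] = "attack at dawn";
    const unsigned char key[] = "s3cret";      /* the only hidden part */
    size_t len = strlen((char *)msg);

    xor_crypt(msg, len, key, sizeof key - 1);  /* encrypt */
    xor_crypt(msg, len, key, sizeof key - 1);  /* decrypt (round trip) */
    printf("%s\n", msg);
    return 0;
}
[/code]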
March 13, 2004, 4:04 PM
K
Exactly. I will tell you that I'm using Triple DES or whatnot, and unless you can break the cipher via some technique no one else knows, I'll feel pretty safe.
March 15, 2004, 8:42 PM
