Valhalla Legends Forums Archive | C/C++ Programming | Type casting or alternative

Yegg
Of the two bits of code below, is one "better" than the other (if so, explain in what way), or does it boil down to preference? Assume n is equal to 128845834980000000.

[code]int x = (unsigned int *) n;[/code]

[code]int x = n % 4294967296;[/code]

First off, I understand that the cast in the first piece of code isn't required, though it keeps the compiler from warning about a possible overflow.
Both pieces of code will produce the same value. Is one better than the other? Why? Or is it just preference?
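
For reference, a minimal sketch of the comparison (assuming n is an unsigned 64-bit value and that unsigned int is 32 bits, as on the platforms mentioned below; the variable names are illustrative):

[code]
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t n = 128845834980000000ULL;   /* the value from the question */

    /* Option 1: the cast. C defines conversion to a 32-bit unsigned
       type as reduction modulo 2^32, so this keeps the low 32 bits. */
    unsigned int a = (unsigned int) n;

    /* Option 2: the explicit modulo by 2^32 (4294967296). This is the
       same mathematical operation, spelled out in the source. */
    unsigned int b = (unsigned int) (n % 4294967296ULL);

    printf("%u %u\n", a, b);              /* both print 2965733632 */
    return 0;
}
[/code]

Since both forms reduce to the same truncation, an optimizing compiler will typically generate identical code for them, which leaves readability as the main difference.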
April 19, 2009, 6:27 AM
BreW
Well, the first bit of code would error on compile. You're trying to assign an int the value of an unsigned int *, which doesn't work out too well.
The second, assuming n is an __int64, would work.
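
For reference, a sketch of the second form with n actually declared; long long is the portable spelling of MSVC's __int64, and the comment notes a caveat the one-liner glosses over:

[code]
#include <stdio.h>

int main(void) {
    long long n = 128845834980000000LL;  /* portable equivalent of __int64 */

    /* 4294967296 is 2^32, so the remainder is the low 32 bits of n.
       That remainder (2965733632) is larger than INT_MAX, though, so
       storing it in a plain int is implementation-defined; on common
       two's-complement platforms it wraps to a negative value. */
    int x = n % 4294967296LL;

    printf("%d\n", x);
    return 0;
}
[/code]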
April 19, 2009, 2:56 PM
Yegg
[quote author=brew link=topic=17913.msg182443#msg182443 date=1240152965]
Well, the first bit of code would error on compile. You're trying to assign an int the value of an unsigned int *, which doesn't work out too well.
The second, assuming n is an __int64, would work.
[/quote]

I ended up asking this on another forum as well. I made a few errors when I asked this question, the main one being that I never define n in my code; n was only used here to hopefully make the question easier to read, but that didn't turn out to be the case. In the actual code, you would see int x = (unsigned int)128845834980000000. The asterisk wasn't supposed to be there, sorry.

As for the code not compiling: it compiles both with and without the asterisk in both Apple gcc and Microsoft Visual C++ Express 2008, and I get the desired/expected results.

The other error I made was that I meant unsigned int x, not just int x. Since n was never actually defined and 128845834980000000 appeared directly in the declaration of x, I figure the compiler was able to do the conversion at compile time.
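
As a sketch, the corrected statement with the conversion spelled out (the literal is too large for 32 bits, so C99 already gives it a 64-bit type on its own; the suffix below just makes that explicit):

[code]
#include <stdio.h>

int main(void) {
    /* Without the cast, compilers warn that the constant overflows the
       32-bit target. The cast documents that the truncation (reduction
       modulo 2^32) is intentional, and because the operand is a
       constant expression the conversion happens at compile time. */
    unsigned int x = (unsigned int) 128845834980000000ULL;

    printf("%u\n", x);  /* 2965733632, i.e. the value mod 2^32 */
    return 0;
}
[/code]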
April 19, 2009, 4:30 PM
BreW
[quote author=Yegg link=topic=17913.msg182446#msg182446 date=1240158628]
As for the code not compiling: it compiles both with and without the asterisk in both Apple gcc and Microsoft Visual C++ Express 2008, and I get the desired/expected results.
[/quote]
I don't use either Apple gcc or VC9, but I don't understand how it would compile: you're explicitly converting an unsigned int into an unsigned int *, then trying to assign that value to an int. I would think it'd throw a type error. Weird.
April 19, 2009, 6:18 PM
Yegg
[quote author=brew link=topic=17913.msg182448#msg182448 date=1240165093]
[quote author=Yegg link=topic=17913.msg182446#msg182446 date=1240158628]
As for the code not compiling: it compiles both with and without the asterisk in both Apple gcc and Microsoft Visual C++ Express 2008, and I get the desired/expected results.
[/quote]
I don't use either Apple gcc or VC9, but I don't understand how it would compile: you're explicitly converting an unsigned int into an unsigned int *, then trying to assign that value to an int. I would think it'd throw a type error. Weird.
[/quote]
A few people on the Devshed forums (forums.devshed.com) also said it would not compile, including one person who answers most of the questions on the C programming forum over there. He also mentioned that I should build with -Wall -Werror in Apple gcc, which enables the common warnings and treats them as errors.
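
A sketch of that advice in action (the file name is hypothetical, and the exact diagnostic text varies between gcc versions):

[code]
/* warn.c -- compile with:  gcc -Wall -Werror warn.c
   gcc normally lets the line below through with a "makes integer from
   pointer without a cast" warning; -Werror promotes that warning (and
   all the others) to hard errors, so the build fails instead. */
int main(void) {
    unsigned long long n = 128845834980000000ULL;
    int x = (unsigned int *) n;  /* warning, or an error under -Werror */
    (void) x;                    /* avoid an unused-variable warning */
    return 0;
}
[/code]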
What compiler are you using? Regular gcc? Something else?
April 19, 2009, 7:19 PM
BreW
[quote author=Yegg link=topic=17913.msg182451#msg182451 date=1240168790]
What compiler are you using? Regular gcc? Something else?
[/quote]
I use MSVC6 and regular gcc. Perhaps the compiler was smart enough to see that the cast was completely unnecessary since the two variables are of the same type, and removed it before it could error.
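
Another reading: the cast isn't removed, it just isn't the problem. Splitting the original line in two shows the conversions that actually take place (a sketch; neither conversion is portable):

[code]
#include <stdio.h>

int main(void) {
    unsigned long long n = 128845834980000000ULL;

    /* Step 1: the explicit cast converts the integer to a pointer.
       C allows this with a cast (the result is implementation-defined,
       possibly with its own warning when the sizes differ), so no
       error here. */
    unsigned int *p = (unsigned int *) n;

    /* Step 2: initializing an int from a pointer violates a
       constraint, but C only requires a diagnostic, so most compilers
       print a warning and carry on. That is why the original line
       compiles despite the type mismatch. */
    int x = p;

    printf("%d\n", x);
    return 0;
}
[/code]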
April 19, 2009, 10:50 PM
Yegg
[quote author=brew link=topic=17913.msg182452#msg182452 date=1240181419]
[quote author=Yegg link=topic=17913.msg182451#msg182451 date=1240168790]
What compiler are you using? Regular gcc? Something else?
[/quote]
I use MSVC6 and regular gcc. Perhaps the compiler was smart enough to see that the cast was completely unnecessary since the two variables are of the same type, and removed it before it could error.
[/quote]

That's kind of what I figured. Still, I'll be changing some gcc settings to turn up the warnings so that I can stay even further away from bad code.
April 20, 2009, 7:24 PM
