x86_64 Linux C/C++ Test

I use:
gcc
cmake
KDevelop
RapidSVN
Meld

However, don’t go to the websites; all of these are available in the default repositories, so you can install them via yum, PackageKit or apt-get. Also note: RapidSVN and Meld are only needed if you want to use SVN. Even KDevelop is not required if you have another text editor that you prefer such as gedit/vi/emacs. If you want to provide your own make file then you don’t need cmake either.
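For reference, the whole toolchain can go in with one command (package names here are my guess at the usual ones; your distribution's may differ slightly):

```shell
# Fedora / yum
sudo yum install gcc gcc-c++ cmake

# Debian / Ubuntu equivalent
# sudo apt-get install build-essential cmake
```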

Anyway, a simple application that just tests that you can do a 64 bit compile is pretty straightforward.
1) Create a CMakeLists.txt:

# Set the minimum cmake version
cmake_minimum_required (VERSION 2.6)
 
# Set the project name
project (size_test)
 
# Add executable called "size_test" that is built from the source file
# "main.cpp". The extensions are automatically found.
add_executable (size_test main.cpp)

2) Create your main.cpp file with an int main(int argc, char* argv[]) in it:

#include <iostream>
 
int main(int argc, char* argv[])
{
  int *int_ptr;
  void *void_ptr;
  int (*funct_ptr)(void);
 
  std::cout<<"sizeof(char):        "<<sizeof(char)<<" bytes"<<std::endl;
  std::cout<<"sizeof(short):       "<<sizeof(short)<<" bytes"<<std::endl;
  std::cout<<"sizeof(int):         "<<sizeof(int)<<" bytes"<<std::endl;
  std::cout<<"sizeof(long):        "<<sizeof(long)<<" bytes"<<std::endl;
  std::cout<<"sizeof(long long):   "<<sizeof(long long)<<" bytes"<<std::endl;
  std::cout<<"------------------------------"<<std::endl;
  std::cout<<"sizeof(float):       "<<sizeof(float)<<" bytes"<<std::endl;
  std::cout<<"sizeof(double):      "<<sizeof(double)<<" bytes"<<std::endl;
  std::cout<<"sizeof(long double): "<<sizeof(long double)<<" bytes"<<std::endl;
  std::cout<<"------------------------------"<<std::endl;
  std::cout<<"sizeof(*int):        "<<sizeof(int_ptr)<<" bytes"<<std::endl;
  std::cout<<"sizeof(*void):       "<<sizeof(void_ptr)<<" bytes"<<std::endl;
  std::cout<<"sizeof(*function):   "<<sizeof(funct_ptr)<<" bytes"<<std::endl;
  std::cout<<"------------------------------"<<std::endl;
  std::cout<<"Architecture:        "<<8 * sizeof(void_ptr)<<" bit"<<std::endl;
 
  return 0;
}

3) cd to the directory of your CMakeLists.txt and run “cmake .” and then “make”
4) ./size_test output:

sizeof(char):        1 bytes
sizeof(short):       2 bytes
sizeof(int):         4 bytes
sizeof(long):        8 bytes
sizeof(long long):   8 bytes
------------------------------
sizeof(float):       4 bytes
sizeof(double):      8 bytes
sizeof(long double): 16 bytes
------------------------------
sizeof(*int):        8 bytes
sizeof(*void):       8 bytes
sizeof(*function):   8 bytes
------------------------------
Architecture:        64 bit

As you can see this is specific to x86_64. The beauty of gcc is that by default it compiles for the architecture it is being run on. I had previously thought that it would be a world of pain, making sure that my compiler built the right executable code and linked in the correct libraries. I know this project doesn’t use any special libraries, but (because of cmake?) the process is exactly the same as using cmake under 32 bit to make 32 bit executables. You just make sure that they are there using Find*.cmake and then add them to the link step:

SET(LIBRARIES
  ALUT
  OpenAL
  GLU
  SDL
  SDL_image
  SDL_net
  SDL_ttf
)
# Some of the libraries have different names than their Find*.cmake name
SET(LIBRARIES_LINKED
  alut
  openal
  GLU
  SDL
  SDL_image
  SDL_net
  SDL_ttf
)
FOREACH(LIBRARY_FILE ${LIBRARIES})
  Find_Package(${LIBRARY_FILE} REQUIRED)
ENDFOREACH(LIBRARY_FILE)
 
# Link our libraries into our executable
TARGET_LINK_LIBRARIES(${PROJECT_NAME} ${LIBRARIES_LINKED})

Note that we don’t actually have to specify the architecture for each package or even the whole executable. This is taken care of by cmake. Anyway, it is not some mysterious black magic, it is exactly the same as you’ve always been doing. Cross compiling is slightly different, but basically you would just specify -m32 and make sure that you link against the 32 bit libraries instead. If I actually bother creating another 32 bit executable in my life I’ll make sure that I document right here exactly how to do a cross compile from 64 bit.

The advantages of 64 bit are, mmm, not so great unless you deal with really big files/memory, ie. more than 4 GB. Perhaps more practical are the extra registers and the upgrading of existing registers to 64 bit, so you may see an increase in speed or parallelisation of 64 bit operations; for example a game may want to use 64 bit colours (ie. 4x 16 bit floats instead of 4x 8 bit ints to represent an rgba pixel).

Things to watch out for:
int is still 32 bit! If I were implementing the x86_64 version of the C++ standard/gcc/in a perfect world, this would have been changed to 64 bit, ie. “int” would be your native int size; it’s logical, it makes sense. However, I do understand that this would have broken a lot of code. The problem is, if int != architecture bits, then why have it at all? Why not drop it from the standard and just have int32_t and int64_t and be done with it? Then if a program chooses it can have:

typedef int32_t int;

or

typedef int64_t int;

as it sees fit. Anyway.
Pointers are now 64 bit! So you can use them with size_t but cannot use them with int.
Under Linux x86_64 gcc sizeof(int*) == sizeof(function pointer), however, this is not guaranteed anywhere; it may change on a particular platform/compiler. Don’t do stuff like this:

int address = &originalvariable;

gcc should warn you (you may need to turn on these warnings: -Wformat -Wconversion).

All in all, if you have been writing standard code and using make/cmake it should be relatively pain free to upgrade to 64 bit.

Linux x86_64

I have been dipping my toe into x86_64 waters sporadically over the last couple of years. On each of the previous occasions it always seemed too immature: packages were way too hard to come by (I prefer precompiled binaries), half my hardware didn’t work, strange crashes, etc. Seeing as this episode has been 100% successful, I thought this time I would document it.

Fedora

My favourite distribution is Fedora due to its rapid development and ease of use. I downloaded it via BitTorrent. (Obviously) make sure you get the x86_64 version. I always like to run the sha checksum to rule that out as the problem if something does arise later. I also make sure that my DVD verifies in my burning program after it has been burnt.

Now we are ready to install. Unless you have something really exotic you should not need any special drivers or anything (At least not until after the install), it should just work. The important parts of my hardware are:
Asus A8V-E SE (Not awesome, my awesome motherboard blew up causing me to downgrade to this one I had lying around) AMD Socket 939
AMD Athlon 64 X2 4800+ CPU
nVidia GeForce 8600 GT 256MB PCIe

I use the onboard LAN and sound card, as well as 2 SATA drives, an IDE drive and an IDE CDROM.

So I installed Fedora from the DVD. You can again choose to verify the media, weirdly (And in previous versions as well) this check always seems to fail even though the sha check and burning software verification succeed, so either the check is broken or the motherboard/drive is broken. I have never seen this verification succeed in my life. Anyway, I skip it now and the options I select (At appropriate times) are fresh install onto a blank drive, “Software Development” profile/packages (You can probably turn off the other profiles, you can install any required packages individually later on when you are in the OS anyway). Next time I do an install I would love to try an upgrade install.

That should all install (You don’t have to get too serious about selecting the right packages right now, I find it easier to install “generally” what I need (“Software Development”) and then customise later) and you should now be logging into a fresh install of Fedora 10.

Initially I had some problems with an additional PCI sound card that was present out of habit, because I had never gotten onboard sound to work for any motherboard under Linux. Some programs were using the onboard one and some were using the PCI one, so I rebooted and went into the bios to disable the onboard one. Both still get detected; apparently this is a common problem with this motherboard. I went to update the bios and wouldn’t you believe it, the bios updater is Windows only. Anyway, because the onboard sound card was being detected I just removed the PCI one and enabled the onboard one again. That fixed it up awesomely and I had audio, yay. Also, removing PulseAudio can “unconfuse” applications and force them to use ALSA:
yum remove alsa-plugins-pulseaudio

I then noticed that I had some issues with audio playback stuttering, cycling through normal speed and then fast for a second and then normal again. I fixed it by following this tutorial.

Add the Livna repo by downloading and running the add-repo rpm; it is not linked to on the main page, but the url can be built from the other releases. Add the RPMFusion repo by downloading and running both the free and non-free add-repo rpms.
For my information:
RPMFusion provides additional packages that are not in the base Fedora repos.
Livna provides the same packages as RPMFusion, but also provides the libdvdcss package for watching DVDs.

I have never had much luck with ATI drivers for Linux. I had heard the nVidia ones were easier to install and configure and apparently faster to boot. Before you install the drivers, you might want to get a benchmark of your FPS in glxgears:
glxgears

I downloaded and installed the nVidia (Binary, proprietary) driver:
sudo yum install kmod-nvidia

Now reboot (It’s the easiest way to restart X). Test that hardware accelerated rendering is happening by looking for “direct rendering: Yes” in the output of this command:
glxinfo | grep direct

And your glxgears FPS should be above 2000:
glxgears

Adobe recently released an x86_64 Linux version of Flash so we don’t have to mess around with nspluginwrapper etc. any more. I downloaded it from here, extracted it, su, cp ./libflashplayer.so /usr/lib64/mozilla/plugins, restarted Firefox. You may want to test it also.

Nexuiz

For my benefit for next time, I also like:
Neverball and Neverputt
VDrift
Torcs
Nexuiz
Open Arena
Urban Terror
XMoto

Transmission
Thunderbird
Rhythmbox
VLC
K3B
Wine
CMake
KDevelop

I have not provided any links to these as they are all present in PackageKit which comes with Fedora 10.

Also for my information:
Firefox Add Ons
Adblock Plus
Flashblock
NoScript
PDF Download
FireBug

Nightly Tools

Net Usage Item

Open links in Firefox in the background
Type about:config into the address bar in Firefox, then look for the line browser.tabs.loadDivertedInBackground and set it to true.

Automatic Login
gksudo gedit /etc/gdm/custom.conf

And adding this text:
[daemon]
# http://live.gnome.org/GDM/2.22/Configuration
TimedLoginEnable=true
TimedLogin=yourusername
TimedLoginDelay=30

NTFS Drives
Gnome automatically finds and mounts NTFS drives/partitions, however in Fedora 9 and later, ownership is now broken. Each partition (And every sub folder and file) seems to default to root ownership, so even though some operations work, such as moving files around, even adding and deleting, some programs will complain (I found this problem through RapidSVN not working). Nautilus reports that you are not the owner and even if you run Nautilus as root you cannot change the owner to anything other than root. The way I solved this was to install ntfs-config and run it with:
sudo ntfs-config

You should now have valid entries in /etc/fstab:
sudo cp /etc/fstab /etc/fstab.bak
gksudo gedit /etc/fstab

Something like this (One for each partition, the ones you are interested in are any with ntfs-3g):
UUID=A2D4DF1DD4DEF291 /media/DUMP ntfs-3g defaults,nosuid,nodev,locale=en_AU.UTF-8 0 0

I then edited each ntfs-3g line like so:
UUID=A2D4DF1DD4DEF291 /media/DUMP ntfs-3g defaults,rw,user,uid=500,gid=500,umask=0022,nosuid,nodev,locale=en_AU.UTF-8 0 0

Where uid=youruserid and gid=yourgroupid. You can find these out via System->Administration->Users and Groups (There is probably a command to find this out, actually I would say there is definitely a command for finding this out, but I’m pretty lazy). If you log in with different users perhaps changing to a common group would be better? Reboot to let these settings take effect. If you now view your partition in Nautilus, File->Properties->Permissions should now list you as the owner with your group.
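There is indeed a command: id. It prints the numeric ids you need for the uid= and gid= mount options:

```shell
# Numeric user id for the current user (use for uid=)
id -u

# Numeric primary group id for the current user (use for gid=)
id -g
```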

You now have a pretty well set up Fedora 10 installation. These steps should be pretty similar for future versions. I will probably refer back to these when I install Fedora 11 or 12 in a year or two. I love Fedora because it is the total opposite of Windows. With Vista, Microsoft stagnated, waiting a year or two longer than they should have to release a product that by that time was out of touch with the target audience. In contrast, I had been planning to install Fedora 9 this time around after installing 8 only 6-12 months ago, but I was pleasantly surprised to find that 10 had been released. I would also like to try Ubuntu as I haven’t really used it much apart from at work, so I might give that a shot next time. x86_64 has certainly matured over the last 2 or 3 years, I would say it is definitely production server ready and probably has been for at least a year. The quality and variety of packages available for Linux is amazing, the community just keeps on giving. Fedora just keeps on amazing me. The future is bright, I can’t wait.

Covers!

Last night we went to The Pheonix (Where I met Christina almost 3 years ago!) and they had The Ellis Collective playing with two other bands. The girl on the violin was the highlight, reminding Sam and me of Fourplay. It would have been good if there were more violin solos; also the songs on their site just don’t do the violin justice.
I heart covers. No wait, I heart covers more than everyone in the world put together. YouTube is awesome for finding covers, it can be quite patchy, but basically everyone votes down the worst ones so you only view the gold.

Johnny from Australia.
Kina Grannis

Johnny sings Exit Music (from a Film) by Radiohead (And he does drums as well, awesome (Well not at the same time, but still awesome)).
Kina sings 1234 by Feist
Johnny sings Waltz #2 by Elliott Smith
Kina sings New Soul by Yael Naïm
Johnny sings Mad World by Gary Jules
Johnny sings Karma Police by Radiohead

Kina sings Hallelujah by Leonard Cohen
Johnny sings Hallelujah by Leonard Cohen
No one is as awesome as Hallelujah Jesus himself though.

I also heart mashups.
Jack Conte specialises in both covering and mashing up songs, some of them are downloadable at.
Jack Conte – Radiohead/Chopin mashup/cover
Jack Conte – Gorillaz – Feel Good Inc. cover, watch this if only for the ending, hilarious.
Jack Conte – Aphex Twin/Bright Eyes mashup/cover

This wouldn’t be a blog entry by Chris if it didn’t include:
Johnny sings *cough* Britney *cough*
Yael Naïm New Soul and Toxic

const int vs. enum vs. #define

Example A:

int GetValue0()
{
  return 10;
}
 
int GetValue1()
{
  return 10 + 10;
}
 
int GetValue2()
{
  return 10 * 10;
}

So if all of the values of 10 represent a common magic number then you are going to want to extract that value to one location instead of the 5 places it is in at the moment. How do we do this?

Say we call the value MAGIC_NUMBER (Of course, in a real life situation you would use a better name than this, wouldn’t you? Something like PI, GST_PERCENTAGE, NUMBER_OF_CLIENTS, etc.) we would then have this code.

Example B:

int GetValue0()
{
  return MAGIC_NUMBER;
}
 
int GetValue1()
{
  return MAGIC_NUMBER + MAGIC_NUMBER;
}
 
int GetValue2()
{
  return MAGIC_NUMBER * MAGIC_NUMBER;
}

Great. The problem now is, how do we tell our program about MAGIC_NUMBER?

If you originally started programming in C then your first instinct may be to use:

#define MAGIC_NUMBER 10

The main problem here is that the compiler may not realise that all of the places you use MAGIC_NUMBER are linked, so it may not realise that it can factor it out. The other problem with this method is the lack of type safety. You can use MAGIC_NUMBER with signed/unsigned ints, char, bool, etc. The compiler may not warn you about converting between these types. For an int this isn’t really a problem, but const float PI = 3.14…f should give you a warning when you try to initialise an int with it. This is good news as it requires you to cast if you really want to do it, which then shows other programmers that you have actually thought about what you are doing and what is happening to the value as it passes through each variable.

You might be tempted to use:

int MAGIC_NUMBER = 10;

This is much better as it adds type safety, however you can do even better than that,

const int MAGIC_NUMBER = 10;

This way it isn’t “just” a global variable; when you declare it as const you are telling the compiler that it will never change in value, which means that it can make all sorts of assumptions about how it will be used. Your compiler may or may not factor out/in this value; it may or may not insert 10 + 10 and 10 * 10 into those functions for you at compile time. Using a const int means that the compiler gets the choice of using either the value directly or a variable containing that value, and I trust the compiler more than myself to make that decision. Because it knows the value of MAGIC_NUMBER at compile time and knows that it will never change at runtime, it can actually do the calculation and insert the result instead:

int GetValue0()
{
  return 10;
}
 
int GetValue1()
{
  return 20;
}
 
int GetValue2()
{
  return 100;
}

The other magic number container is enum. It differs slightly from const int as it is more for collections of values where you want to identify something by what type it is.

enum RESULT
{
  RESULT_FILE_DOWNLOADED,
  RESULT_CONNECTION_FAILED, 
  RESULT_FILE_NOT_FOUND,
  RESULT_DISCONNECTED
};
 
RESULT DownloadFile(const std::string& url)
{
  // Pseudocode
  if (could not connect) return RESULT_CONNECTION_FAILED;
  if (file not found) return RESULT_FILE_NOT_FOUND; 
 
  if (disconnected) return RESULT_DISCONNECTED;
 
  return RESULT_FILE_DOWNLOADED;
}

In this way we can get rid of magic numbers and (In C++0x at least, with enum class) get some type safety; your compiler will hopefully complain if you try to return an integer instead of a RESULT.

Crank very early testing of heightmap and basic physics

This is what I have been working on in my spare time. I think it is going alright. I still have to do:
The brakes can’t be locked up at the moment, they always give a little so you can’t for example lean on the front wheel and hold the brakes and expect to stay in the same place.
Create the level editor, at the moment the level is just a bunch of sin waves added together.
Add a little bit of parallax scrolling to add a bit of interest and depth.
Create better artwork for example shocks and rider.
Add dynamic bike bits such as suspension that compresses and a rider that leans forwards and backwards and has a weight (At the moment I fake weight transfer by just rotating the bike).

Library source
Game source

Call of Duty 4 AI and difficulty levels

I started and finished Call of Duty 4 yesterday. Basically it is a war themed FPS that tries to capture the feeling of helplessness of being in a war. The standard method for creating each level in a game like this (Quake 1) is to create the “map” that the player will run around in and then manually add a series of points (Spawn points) in the map that represent each bad guy. So if you add 20 points, you will have 20 bad guys. Another method is to specify 100 spawn points and then spawn 20 bad guys at 20 of those points, which mixes it up a little bit so that the game is less predictable. In earlier games like Quake 1 and Wolfenstein, these bad guys had a problem where they would just sit on their starting position until they saw the player, at which point they would basically run straight at the player shooting, pretty simple stuff.

Call of Duty differs slightly from other games:
a) I think there are actually fewer spawn points than in a conventional game, however,
b) There is not a 1 to 1 mapping of bad guys to spawn points, bad guys constantly stream out of these spawn points.
c) These spawn points are placed slightly off the player accessible part of the map, and then the enemies jump over walls, emerge from doors and alleyways, helicopter and rappel in, etc.

There are also the usual predictable moments, “There’ll be a guy behind the door here, a guy will run through now”, but it’s much less noticeable. Each spawn point just continuously spawns bad guys until you either get to a certain point on the map, at which point the next spawn point will start spawning, or a time constraint runs out (“Defend this point for 2 minutes”).

All of the Call of Duty games (And most FPS games in general) have a configurable difficulty system consisting of something like “Easy”, “Medium”, “Hard”, “Insane”. The problem is that it is usually set for the whole game, so you choose medium and go to the first level which introduces you to the game, then the levels get progressively harder; the problem is that the first level isn’t a real representation of the difficulty. By the middle of the game it can feel way too hard or way too easy. Grand Theft Auto solves this problem by having every player play the same difficulty and then the missions get progressively harder until I give up and don’t finish the game, so that solution isn’t without its problems. It would be much better if every game were Grand Theft Auto style, but where the game adapted to how good the player was. Some games are already similar to this, but instead of lowering/raising the difficulty of the enemies they give you more or less health and ammunition. However it could be possible to exploit this method by playing badly until the last level and then easily beating the last few levels and completing the game. The other problem is how to differentiate between a good and bad player. Each player would have a few areas where they are judged, such as health at the end of each level, health lost per enemy encountered (A ratio something like 15% health lost per enemy killed), speed through the level, accuracy with each weapon/speed. You could even add other things such as exploration of the level, stealth and variety of weapons/styles.

It’s hard to tell the difference between whether a player is going slow because they aren’t competent enough or if they are just taking their time and being methodical, for example. The other problem is how do you tell the player’s speed through the game? The easiest (And roughest) way is to get their total time through the level from entry to exit. You could also have points along the typical player’s route and time the player between them, or find the average time between the player seeing each enemy and killing them, or the total kills divided by the total level time. You could even have an experienced player play through the game, then a less experienced player, recording both their play and then scaling the difficulty based on which one the player is playing most like.

Another interesting method would be to identify a few main playing styles such as bunnyhopping, rushing and camping. You could then spawn more or fewer enemies, and at different distances from the player, to force the player into the mode you want them to be in. For example if they are rushing, spawn more enemies behind or above the player to force them to take their time and look around a bit more. If the player is taking too much time and sniping too much, spawn some enemies just around the corner who will rush the player and force her to switch to a short range weapon.

The first mission of Call of Duty 4 is the unskippable training mission for people who have never played an FPS before. “Here’s a gun, here’s a target, here’s a grenade, etc.”. It takes about 10 minutes and should be skippable considering that almost everyone who plays will have played an FPS previously and a lot of them would have actually played previous Call of Duty games. It is a prime candidate for starting with a message box that says “Do you want to try a training mission before you get stuck in?”. Another option would be to have everyone go straight to the first real mission and either start real slow, with a run through the woods or something with lots of running and one enemy every minute and then gradually introducing the player to other weapons and skills (“Here’s a grenade, throw it with this button…”, “Crouch behind this wall so that the guard can’t see you”).

Anyway, it’s a good game, but it doesn’t really do anything the previous versions didn’t do. The highlight for me was running around in the Ghillie suit hiding awesomely. It would have been better if there were a lot more Ghillie suit missions. Being the gunner for the AC-130 was fun too but went on way too long.