Running Marketscape Experiments



Once the experiment is set up, log onto eeps4.caltech.edu. Using an ssh client is recommended. For a Windows ssh client, see http://www.openssh.com/windows.html. My personal favorite is PuTTY (http://www.chiark.greenend.org.uk/~sgtatham/putty/).
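From a machine with a command-line ssh client, logging in looks something like this (the account name is just a placeholder; use whatever account you were given on eeps4):

    ssh your_account@eeps4.caltech.edu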


Understanding and modifying the parameter files:


Understanding the parameter files is important if you want to check a setup, debug an experiment when something goes wrong, or copy a parameter file to a new experiment (an easy way to duplicate or modify an existing one).


The parameter files are located in $HOME/setup/param (i.e. you can get to the directory by typing: cd ~/setup/param). The filenames are what you entered when you originally created the experiment, either by hand or through the web interface.


If you are planning to run an experiment again, with or without modifications, you should copy the param file to a new one. Ex: cp 001129-21:00:00 001206-19:30:00

to copy the November 29th 9pm experiment to one that will run on December 6th at 7:30pm. Note that the file names start with a two-digit year, then the month and day, followed by a time in 24-hour (military) format.


Now you will need to edit the new param file. Eeps4 has vi/vim and emacs installed. If you are unfamiliar with these text editors, it is worth spending a few minutes on the beginning of the man page for one of them (try: man vim); each has a short "getting started" section.


You will need to carefully check and alter the first 19 lines of the file. These control the ports, experiment name, location on the computer, and URL. You will need to pick a new base port the same way discussed earlier for new experiments. The other ports are just the base port with different values in the two least significant digits (i.e. base port mod 100 = 0, and each other port equals base port + x where x < 100). So leave the last two digits of each port alone, but do change the rest to the new base port.
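For example (these port numbers are made up purely for illustration), if the old base port were 20000 and the new base port 20300, then a port listed as 20017 would become 20317: the last two digits (x = 17) stay the same, and only the base-port part changes.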


Next, search for "outputdir" (in vi/vim, typing "/outputdir" will find it) and replace its value with a new output directory name; again, just use common sense in your naming. Below that is the paramfile setting. Change that to the name of the new parameter file.
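If the old experiment's date string appears only in the settings you intend to change (an assumption worth verifying with a quick search first), a single vim substitution will update outputdir, paramfile, and the url in one step; check the file afterwards in any case:

    :%s/001129/001206/g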


The last change you have to make is the starttime setting. This is a 32-bit integer giving the number of seconds since 00:00:00 Jan 1 1970 UTC (approximately the same as Greenwich Mean Time); this is how Linux keeps track of time internally. You will need to change this to the new start time. The easiest way is simply to add or subtract the number of seconds between the original start time and the run time of the new experiment. If you want another method, I've written a program which you can run by typing ~ec160/bin/h2lin_time (human to linux time). You might want to run it twice, just to make sure you didn't make a mistake.
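To continue the running example: moving from November 29th at 9:00pm to December 6th at 7:30pm is 7 days minus 1.5 hours, i.e. 7*86400 - 5400 = 599400 seconds, so you would add 599400 to the old starttime. If GNU date happens to be installed (an assumption; ~ec160/bin/h2lin_time does the same job), it can also compute the value directly, interpreting the time in the machine's local time zone:

    date -d "2000-12-06 19:30:00" +%s    # prints seconds since Jan 1 1970 UTC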


After all those changes are made, go back and alter any of the experiment's market parameters that need changing. The fields should be pretty obvious, the web form mimics the param file fairly closely.


Finally, once that's done, save the file and continue as though you had just completed the web forms. Go back to your home directory (run: cd ~, or just cd with no arguments). Then cd into exp and run ./inst.
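For reference, the whole sequence from anywhere on the system is just:

    cd ~/exp
    ./inst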


Useful Parameter Changes During the Actual Experiment:


(All paths in this section are relative to your experiment's directory - probably something like /home/ec160/exp/001129/)


One major problem we had was a really slow refresh of the private markets. It's a good idea to prepare for more subjects than you expect, but this can hurt actual performance at run-time. Decreasing the number of private markets the server manages is an easy way to fix this problem.

Once the market is created, there will be a number of files named market/param/priv#.dat (ex: priv1.dat should always be there). Edit each of the priv#.dat files, keeping only the first n lines of each, where n is the number of private markets you want available. (I would suggest copying each to a backup first, in case more people show up than expected: simply cp priv1.dat priv1.orig, etc.) A quick, easy way to prune these files: head -<n> priv#.dat > tmp; mv tmp priv#.dat. For example, say you created an experiment and allowed for 100 subjects, and only 6 show up, so you want to prune it down to 10 private markets (in case some more show up). You would run something like this for each file: head -10 priv1.dat > tmp; mv tmp priv1.dat
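If there are several priv#.dat files to prune, a small shell loop can back up and truncate them all at once. This is only a sketch: it assumes a bash shell, that you have cd'd into the experiment's market/param directory, and that 10 is the number of private markets you want to keep.

    # back up each private-market file, then keep only its first 10 lines
    for f in priv*.dat; do
        cp "$f" "${f%.dat}.orig"              # e.g. priv1.dat -> priv1.orig
        head -n 10 "$f" > tmp && mv tmp "$f"
    done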


More sophisticated alterations are probably a bad idea unless you really know what's going on. Most of the other alterations will involve modifying web pages and messages to pass information to the subjects.


Monitoring the Experiment:


/var/log/httpd contains the logs from the web server. If you want to watch for people accessing your experiment, you might want to leave a window open: cd to /var/log/httpd and then run: tail -f access.log | grep <your market url, as entered in the param file>

ex: tail -f access.log | grep market-001129


This will enable you to watch everyone who accesses the market. It will also match any markets with the same root name. If you find this to be a problem, you can add additional greps with the -v flag to filter out lines containing a given string.

ex: tail -f access.log | grep market-001129 | grep -v market-001129ryr. You may find it useful to chain several of these greps to suit your needs.


Another file you might want to monitor is the registration log. You can watch as people sign in with another tail -f. The file is in your experiment directory in data/reg; the filename is "log". So, for example, you might run: tail -f /home/ec160/exp/001129/data/reg/log


If you want to see how many subjects are currently logged in (not counting spectators, who log in as 1), use: wc -l /home/ec160/exp/001129/data/reg/log


This file is also particularly useful at the end of the experiment, when you need to correlate subject ids to people. It also happens to contain all the information you will need to send them their checks.


While the experiment is running, you may wish to monitor a subject's behavior. The easiest way to unobtrusively monitor participants is to log in as them (using their ID number and the default password). You should refresh frequently to watch what trades they are making. We found this somewhat interesting.

Another piece of information you might need while the experiment is running is the set of passwords that were assigned to participants as they logged in. The password listing is in the file data/passwords. An easy way to look up a user's password (by subject id) is to grep for it. For example, if we are running market-001129 and I want to know the password of user 103: grep 103 /home/ec160/exp/001129/data/passwords
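Note that a bare grep 103 will also match any other line that happens to contain 103 (inside a password, for instance). If the subject id begins each line of the passwords file (an assumption about the file format worth checking), anchoring the pattern avoids that:

    grep '^103' /home/ec160/exp/001129/data/passwords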


An id can be logged in from multiple computers. This may be a security hole, but you will see the advantages of it during run-time.


Good luck, and have fun