Adam McD
United States Minnesota

Lately I have been thinking a lot about a board game mechanism for earning money that I thought up. I'd like to hear what y'all think about it.
How the mechanism works:
Pretend it is a money-earning round for all players. Each player chooses a monetary value in secret between $0.00 and $100.00 (including '$100.00'); call this value V. Once all players have chosen a value, each player gets paid $V, plus they get paid $V again for each other player that chose a higher value than they did (ties do not count for this).
Here is an example:
Suppose the players chose the following values...
P1: $10.00
P2: $20.00
P3: $30.00
P4: $40.00
P5: $50.01
P6: $60.00
P7: $70.20
P8: $80.00
P9: $91.23
P10: $10.00
Then the players would win the following amounts (notice that for an exact tie [P1 & P10], the other player does not earn you money)...
P1: $90.00 (= $10.00 + 8*$10.00)
P2: $160.00 (= $20.00 + 7*$20.00)
P3: $210.00 (= $30.00 + 6*$30.00)
P4: $240.00 (= $40.00 + 5*$40.00)
P5: $250.05 (= $50.01 + 4*$50.01)
P6: $240.00 (= $60.00 + 3*$60.00)
P7: $210.60 (= $70.20 + 2*$70.20)
P8: $160.00 (= $80.00 + 1*$80.00)
P9: $91.23 (= $91.23 + 0*$91.23)
P10: $90.00 (= $10.00 + 8*$10.00)
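For concreteness, the payout rule can be written as a few lines of code (a sketch of my own; the function name and list format are illustrative, not part of any actual game):

```python
def payouts(bids):
    # Each player earns their chosen value V once, plus V again for every
    # other player whose bid is strictly higher (exact ties do not count).
    return [v * (1 + sum(1 for w in bids if w > v)) for v in bids]

bids = [10.00, 20.00, 30.00, 40.00, 50.01, 60.00, 70.20, 80.00, 91.23, 10.00]
for i, p in enumerate(payouts(bids), start=1):
    print(f"P{i}: ${p:.2f}")
```

Running this reproduces the table above (P1 gets $90.00, P5 gets $250.05, and so on).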
My comments:
If a player only needed $100 in order to somehow win in the next round, then he/she could just select '$100.00' and not worry about what other players are picking. If this type of selection were common, some players might pick a number like '$98.00' more often, in hopes of getting 3-4 times this amount.
There seem to be safe bets & risky bets, but I am not sure what the best strategy is. It seems like it would be an interesting mechanism to use during a 'money-earning' round in a board game. Does anyone else have any thoughts? Also, does anyone know of games that use a mechanism similar to this?
FYI: I am running a contest, HERE, to test this mechanism out on BGG users. I will post the results there when it is done.



(it's 8am in NZ so excuse lack of logic at times)
First problem I see is, most board games only have 2-6 players.
So unless this is something like "Werewolf"/"Mafia" or similar with a lot more players, there's a minor problem.
To fix that problem you'd either need a smaller pool of values, or make sure there are a lot of players.
Also, I suspect that either the players will tend to pick the same number each turn, or it will become reasonably obvious over time what a good strategy is.
I suspect this probably has a "max/min" solution, and I would think picking $50 is probably close to being a good general solution.

Adam McD
United States Minnesota

Thanks for your input j_holmes, it's good to hear a view that is very different from my own.
j_holmes wrote: (it's 8am in NZ so excuse lack of logic at times)
First problem I see is, most board games only have 26 players.
I just used [players] = 10 and [max $] = 100.00 as an example. These could be changed to suit whatever one needed without affecting much (as long as the values stayed reasonable and proportional). No big deal there.
j_holmes wrote: Also, I suspect that either the players will tend to pick the same number each turn, or it will become reasonably obvious over time what a good strategy is.
This would surprise me. If I know what the other players are going to pick, I should be able to pick just under one of the middlemost picks so that I have a high value myself, and a large 'multiplier' as well.
If you overpicked everyone else, I think you would be encouraged to try to pick a lower value. On the other hand, the lowest picker might stay where he/she is if he/she won the most money last time. If he/she chose too low, and ten times that value was not competitively high compared to others' winnings, then he/she would be encouraged to pick higher. But who knows...
j_holmes wrote: I suspect this probably has a "max/min" solution, and I would think picking $50 is probably close to being a good general solution.
I ran some 'tests' of my own. I started with random numbers, then I 'improved' each player's guess in successive rounds. Initially, the mid- to low-valued guessers would win, so they guessed the same, while higher-valued guessers would lose (due to a low 'multiplier'), so they would guess lower. This train of thought drove the average guess down over time into the '20s/'30s range, where it stabilized. But that is very math/game-theory-esque, and may not apply to real life.
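This hand-run "improvement" step can be mimicked with a best-response calculation: given what the other players bid last round, find the bid that would have earned the most (a sketch of my own; the function name and the specific scenarios are illustrative):

```python
def best_response(others, max_bid=100):
    # The bid x that maximizes x * (1 + number of others strictly above x),
    # assuming the other players simply repeat last round's bids.
    def earn(x):
        return x * (1 + sum(1 for o in others if o > x))
    return max(range(1, max_bid + 1), key=earn)

# If the other nine players all bid 50, undercutting to 49 earns 49*10 = 490.
print(best_response([50] * 9))   # -> 49
# If they all bid 10, undercutting caps out at 9*10 = 90, so bid the max.
print(best_response([10] * 9))   # -> 100
```

The second case shows why the downward drift eventually stabilizes: once the crowd's bids are low enough, bidding the maximum beats undercutting.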
I also wonder about what I said in my original post. What if your goal isn't only to maximize the money you earn? What if you really just need a guaranteed $94 for next round to achieve some goal or build something you want ASAP? You could guess '$50' and hope that you are not the highest bidder, or you could play it 'safe' and guess $94 on the nose. Is this really playing it safe, though? Perhaps 1 or 2 other players are picking near $100 for similar reasons... perhaps then you will be getting 3*$94 instead of just $94. Hmm...

¡dn ʇǝƃ ʇ,uɐɔ ı puɐ uǝllɐɟ ǝʌ,ı
Canada Chestermere Alberta
Life lesson: Hamsters are NOT dishwasher safe.
There are 10 types of people: those who understand binary, and those who don't.

My concern with the idea is that it requires a) a lot of thinking and planning, and b) a lot of math.
I play most games with this one guy, in particular, who won't make a move until he's gone over all of the alternatives in his head. This is the kind of mechanism that would drive me batty, while I wait for him to second-, third-, and fourth-guess himself.
The math makes perfect sense, in your example, and I realize that you'd scale it down to five or six players, but if people have to start multiplying figures, they're going to rebel. Especially if you left those decimal points in there. $9.21 times 8? No one wants to try to get that right. You're going to need a chart with income on one leg and # of players on the other, so that people can cross-reference fast and without doing math.
My other thought is that if you wanted to eliminate the high bids ($98, $99, $100), although I have no testing to prove that it might work, you could penalize the top bidder only. So that if I did bid $100, I would automatically lose half, or all, or whatever. Or base the multiplied value on the spread between the lowest and highest bids. If someone bids $10 and another one bids $90, there's $80 between highest and lowest. Multiply the number of people who bid higher than you did by $80. Lowest bid gets the highest # to multiply with, highest bid gets $0, since no one bid higher: $80 x 0 players = $0.

B C Z
United States Reston Virginia

I had a really nice post eaten by lag.
It summed up as this:
With a flat distribution across 100 values (1-100), 50/51 are the best choices, and 50 is better because it'll catch more 'above it'.
Indeed, any given pair n/(101-n) will favor the lower value.
There is no reason to go for a higher base value without an outside impetus beyond 'maximize return'.
A bid of 1 will yield at most the number of players. A bid of 100 will yield at most 100.
Where p = # of players, a bid of px+1 will always yield more than a bid of x (which yields at most px), provided px+1 < max.
AND
Maximum possible yield is (max-1)*p, if everyone stupidly picks max except one person who is one allowable unit below (that person gets max-1 as a base, plus max-1 again for each of the p-1 higher bidders).

Adam McD
United States Minnesota

Thanks for the comments.
MABBY wrote: My concern with the idea is that it requires a) a lot of thinking and planning, and b) a lot of math.
I play most games with this one guy, in particular, who won't make a move until he's gone over all of the alternatives in his head. This is the kind of mechanism that would drive me batty, while I wait for him to second-, third-, and fourth-guess himself. Ack! I hate it when people take long turns. I would have to change the game so that players choose numbers from 1 to 10 (or 1 to 20, something like this), so that there is less to think about. This is more reasonable anyhow.
MABBY wrote: The math makes perfect sense, in your example, and I realize that you'd scale it down to five or six players, but if people have to start multiplying figures, they're going to rebel. Especially if you left those decimal points in there. $9.21 times 8? No one wants to try to get that right. You're going to need a chart with income on one leg and # of players on the other, so that people can cross-reference fast and without doing math.
Choosing from 1 to 10 (or 1 to 20) instead of cents would also fix this. I was just using $'s and ¢'s because I was thinking of it in a TV-game-show setting at first. Lame, I know.
MABBY wrote: My other thought is that if you wanted to eliminate the high bids ($98, $99, $100), although I have no testing to prove that it might work, you could penalize the top bidder only. So that if I did bid $100, I would automatically lose half, or all, or whatever. Or base the multiplied value on the spread between the lowest and highest bids. If someone bids $10 and another one bids $90, there's $80 between highest and lowest. Multiply the number of people who bid higher than you did by $80. Lowest bid gets the highest # to multiply with, highest bid gets $0, since no one bid higher: $80 x 0 players = $0. Hmm... I actually wasn't concerned about high bids (should I be!?). I was simply wondering if high bids would occur sometimes. It might actually be a good thing, because I think there is mostly pressure to undercut others (hence pressure to lower your bid). But maybe I'm missing something...

B C Z
United States Reston Virginia

Thought experiment:
Allowable values: 1
Everyone bids the same (1). Everyone gets the same (1).
Allowable values: 1-2
Bidding high gets you at most 2 (no one is higher). Bidding low gets you 1 + the number of players bidding high:
0 other bidders bidding high: everyone gets 1
1 other bidder bidding high: everyone gets 2
2 other bidders bidding high: 'high' gets 2, everyone else gets 3
n other bidders bidding high: 'high' gets 2, everyone else gets n+1
Allowable values: 1-3
A bid of 3 gets at most 3.
A bid of 2 gets 2, 4, 6, 8... if 0, 1, 2, 3 other players bid 3.
A bid of 1 gets 1, 2, 3, 4... if 0, 1, 2, 3 other players bid 2 or 3.
Number of players is the key variable in this equation.

Adam McD
United States Minnesota

byronczimmer wrote: Thought experiment:
Allowable values: 1
Everyone bids the same (1). Everyone gets the same (1).
Allowable values: 1-2
Bidding high gets you at most 2 (no one is higher). Bidding low gets you 1 + the number of players bidding high:
0 other bidders bidding high: everyone gets 1
1 other bidder bidding high: everyone gets 2
2 other bidders bidding high: 'high' gets 2, everyone else gets 3
n other bidders bidding high: 'high' gets 2, everyone else gets n+1
Allowable values: 1-3
A bid of 3 gets at most 3.
A bid of 2 gets 2, 4, 6, 8... if 0, 1, 2, 3 other players bid 3.
A bid of 1 gets 1, 2, 3, 4... if 0, 1, 2, 3 other players bid 2 or 3.
Number of players is the key variable in this equation.
Or, we could fix the bid_max at 100 (min at 0), and slowly increase the # of players...
1 player: To bid 100 is best. Obvious.
2 players: Bidding below 50 seems senseless, as you would then win less than 100 (and you could get 100 just by bidding '100'). If both players realize this fact, then perhaps both would decide to bid somewhere between 50 and 100 (including '50' & '100' as possibilities). But if both players bid between 50 & 100, then the lower-bidding player will win. So, one should bid on the low end (perhaps bid exactly 50?). However, if you expected your opponent to bid 50 or higher, but probably on the low end, then you could secure a win by guessing '49'. But if your opponent knows you plan to outmaneuver him/her with a slight underbid (one that is less than 50), then he/she could just bump his/her bid back up to 100 to ensure a win...
I was surprised to find this sort of circular Princess-Bride-esque logic already occurring at the two-player level. I'm not sure what the best answer is here, but it seems like '50' is a good guess since it ties or beats 76 other possible bids ('1' through '25' and '50' through '100'). We could look at this problem as maximizing the number of opponent pick combinations that allow you to win...
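That '76' can be checked by brute force (a quick sketch of my own, enumerating whole-dollar opponent bids from 1 to 100):

```python
def total(my_bid, opp_bid):
    # Two-player total: your bid, doubled if the opponent bid strictly higher.
    return my_bid * (2 if opp_bid > my_bid else 1)

# Count the opponent bids that a bid of 50 ties or beats.
ties_or_beats = sum(1 for y in range(1, 101) if total(50, y) >= total(y, 50))
print(ties_or_beats)   # -> 76
```

A bid of 50 loses only to opponent bids of 26 through 49, which get doubled past it.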



It's not so much what happens with 2 players, it's more what happens with 4 or 5 players.
As someone else points out, there's also a bit of work to then determine how much everyone gets paid out.
I do think this idea might have some merit, but I think you really need to clarify if it's for 2-4 player games, 3-6 player games, or Werewolf/Mafia style 5-20+ player games.

Jack Neal
United States Liverpool New York

Can you bring the range down to $1-10, whole numbers? That simplifies the math down considerably and makes it easier to figure out the permutations without adding a lot of precision.
Just my $0.02 worth of a bid.

B C Z
United States Reston Virginia

In reality, the fidelity and number of possible choices don't matter a lick.
What matters is the range as it relates to the number of players and, probably, how many times they've played against each other.

Jack Neal
United States Liverpool New York

Well, if that's the case, bring the range down to the number of players x 2.

Adam McD
United States Minnesota

Raiderjakk wrote: Can you bring the range down to $1-10, whole numbers? That simplifies the math down considerably and makes it easier to figure out the permutations without adding a lot of precision. Just my $0.02 worth of a bid. Yes, this is what I suggested before. As I said, I was in TV-game-show mode when I was thinking of the $0.00 to $100.00 range...

Jack Neal
United States Liverpool New York

AdamMcD wrote: Yes, this is what I suggested before. As I said, I was in TV-game-show mode when I was thinking of the $0.00 to $100.00 range...
Sorry about that; I obviously missed it.
Interesting idea though....

Adam McD
United States Minnesota

AdamMcD wrote: Or, we could fix the bid_max at 100 (min at 0), and slowly increase the # of players... I thought more about this. Actually, I did more than just think. Here's what I found...
(Warning: if you are not a math geek, you may find the following material boring)
Let's put the max bid at 10 instead of 100, so there are only ten possible bids: 1, 2, 3, ... , 10. We'll keep the # of players at 2.
Let's compare a bid of X to a bid of Y, for all possible X and Y in the set {1,2,...,10}.
I will use the word "total" to mean how much money you earn: your bid, plus your bid a second time if your opponent bid strictly higher than you. E.g., if you bid 3 and your opponent bids 7, your total is $6 and your opponent's total is $7.
I will add in a "1" for every win (when your total is higher than your opponent's), a "0" for a loss, and a "1/2" for a tie (I will later come to realize that granting "1/2" for a tie may have been a mistake).
Let's now compare all X,Y combos like we said we would do:
[Your bid (X)], [Average # wins out of 10 (Of the 10 possible Y's, add 1 for each win and 1/2 for each tie)]
$1, 1 = 0 + 2*(1/2)
$2, 2.5 = 1 + 3*(1/2)
$3, 4 = 3 + 2*(1/2)
$4, 5.5 = 4 + 3*(1/2)
$5, 7 = 6 + 2*(1/2)
$6, 7 = 6 + 2*(1/2)
$7, 6.5 = 6 + 1*(1/2)
$8, 6 = 5 + 2*(1/2)
$9, 5.5 = 5 + 1*(1/2)
$10, 5 = 4 + 2*(1/2)
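This table can be reproduced with a short script (a sketch of my own; the 1 / 1/2 / 0 scoring is as described above):

```python
def total(x, y):
    # Your bid, doubled if the opponent bid strictly higher.
    return x * (2 if y > x else 1)

def score(x, y):
    # 1 for a win, 1/2 for a tie, 0 for a loss.
    tx, ty = total(x, y), total(y, x)
    return 1.0 if tx > ty else 0.5 if tx == ty else 0.0

# Score of each bid 1..10 against all ten possible opponent bids.
scores = [sum(score(x, y) for y in range(1, 11)) for x in range(1, 11)]
print(scores)        # -> [1.0, 2.5, 4.0, 5.5, 7.0, 7.0, 6.5, 6.0, 5.5, 5.0]
print(sum(scores))   # -> 50.0
```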
The above would seem to indicate that $5 or $6 is the "best" bid. Does this mean we should always bid $5 or $6? No.
If you always bid $5 or $6, then the opponent will likely bid a $4 quite often. And if $4, $5 & $6 are the only three bids ever made, then $3 would be a great choice, since it beats or ties all of these. And so on...
Consider how a player should never settle on 1 or 2 choices in rock/paper/scissors. For example, if you almost always choose rock, and almost never paper or scissors, then your opponent will likely choose paper very often. The game of R/P/S is perfectly symmetrical, and indeed the optimal strategy is to try and pick each one of the three around 33% of the time (unless you can capitalize on an opponent who foolishly does not pick with a 33%/33%/33% distribution).
Similar to rock/paper/scissors, players should choose a bid of $1, $2, $3, ... , $10 with certain probabilities, in order to keep their opponents from outguessing them.
Basically, what I did earlier was start with the assumption that all bids $1 through $10 were equally probable (or equally good), and used that to find a better distribution of bids.
How do you get these percentages? Recall that earlier I got this:
$1, 1 = 0 + 2*(1/2)
$2, 2.5 = 1 + 3*(1/2)
$3, 4 = 3 + 2*(1/2)
$4, 5.5 = 4 + 3*(1/2)
$5, 7 = 6 + 2*(1/2)
$6, 7 = 6 + 2*(1/2)
$7, 6.5 = 6 + 1*(1/2)
$8, 6 = 5 + 2*(1/2)
$9, 5.5 = 5 + 1*(1/2)
$10, 5 = 4 + 2*(1/2)
So now I take 1 + 2.5 + 4 + 5.5 + 7 + 7 + 6.5 + 6 + 5.5 + 5 = 50.
I did not just get "50" randomly. This number 50 represents the 50 ways that player X can win over player Y (counting ties as '1/2', for each player) if we consider each possible (X,Y) combo for X's and Y's ranging from $1 to $10 (there are 100 such combos, which makes sense considering player X and player Y each win 50 of the 100, assuming everything starts with an even distribution).
So, according to the above ("better") chart, what % of the time should I bid a $2? Just take 2.5/total = 2.5/50 = 0.05 = 5%. So don't bid $2 very often...
How about a $5 or a $6? Take 7/50 = 0.14 = 14%. So apparently we should be bidding a $5 or a $6 28% (= 14% + 14%) of the time.
But I digress. The above table is just the 2nd generation calculation of obtaining a better bid distribution. Creating a 10x10 matrix and taking the Nth power will lead one to the Nth generational table of suggested bid distributions. Higher powers give "better" results, but this process converges rather quickly. Take a look:
Bid    G1%   G2%   G11%   G22%   G66%
$1     10    2     1      1/2    1/2
$2     10    3     4      5      5
$3     10    8     10     10     10
$4     10    11    13     13     13
$5     10    14    15     15     15
$6     10    14    14     14     14
$7     10    13    12     12     12
$8     10    12    11     11     11
$9     10    11    10     10     10
$10    10    10    10     10     10
(intermediate generations omitted)
The % of the time that this table tells a player to bid $1 settles to about 0.5%. This should be 0%, not 1/2% (because you can never win by bidding $1). The reason it does not go to 0% has to do with the fact that I let a tie be worth '1/2' initially. This means that two ties are just as good as one win in the eyes of this process. So, even though $1 gets no wins, it could eventually get a couple of ties (assuming the other player plays with the same strategy), which is just as good as a win, doh!
I'd be willing to bet dollars to donuts that if I played a large number of matches using my strategy from the G22/G66 column against an opponent using the uniform-distribution strategy, I would win more battles than my opponent (unless he/she switched to the same strategy, or a very similar one). In fact, I would probably bet a decent sum of money on this ( right after I check my arithmetic... ).
Oh, and since that column adds up to 100.5%, I would just drop the '1/2' and bid $1 zero percent of the time. No problem.
[Edit: I should note that this strategy I came up with is not optimal. It is simply designed to beat the 10%/10%/.../10% strategy (and it does... I checked). Playing 50% $5's and 50% $6's, for example, will usually beat the strategy developed here. This one just seems to be an equilibrium of some kind. If someone knows more game theory or linear algebra than I do and would care to comment, please do.]
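The generational process amounts to repeatedly multiplying the bid distribution by the 10x10 win/tie matrix and renormalizing. A sketch (my own code, reconstructing the process described above, not taken from any actual program):

```python
def total(x, y):
    # Your bid, doubled if the opponent bid strictly higher.
    return x * (2 if y > x else 1)

def score(x, y):
    # 1 for a win, 1/2 for a tie, 0 for a loss.
    tx, ty = total(x, y), total(y, x)
    return 1.0 if tx > ty else 0.5 if tx == ty else 0.0

# M[i][j] = score of bid i+1 against bid j+1
M = [[score(x, y) for y in range(1, 11)] for x in range(1, 11)]

p = [0.1] * 10                      # G1: the uniform distribution
for _ in range(66):                 # iterate out to (roughly) G66
    q = [sum(M[i][j] * p[j] for j in range(10)) for i in range(10)]
    s = sum(q)
    p = [x / s for x in q]          # renormalize to a probability distribution

print([round(100 * x, 1) for x in p])   # percentages, roughly the G66 column
```

As noted above, the weight on $1 never quite reaches zero because ties are worth 1/2.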
Sorry to be so lengthy! I hope at least one person finds this post interesting.

Wim van Gruisen
Netherlands Den Bosch Unspecified

I tried things out in Excel. Three players, range 110. First bid is random, after that a strategy was formulated. If you bid highest in the last round, you bid one less now. Else, if you gained less than the average, you bid one more now. Else, you bid what you bid last time.
It was a race to the bottom. First, values converge (what I expected). Then, because of the way I formulated the spreadsheet, when the values became equal (or when the two highest values were equal), they were both considered highest, and thus each bid one less the next round.
Question to the readers: can you devise a better strategy? One that can be defined in one formula in Excel?
What I have now, the variables that you can work with:
* Three players, values between 1 and 10.
* The first three lines (2-4) give the value of the bid. This is where I enter your formula.
* Lines 7-9 determine how your bid relates to the others. It determines the multiplier. A value of 1 indicates that nobody bid higher than you, 2 that only one person bid higher than you, 3 that both other players bid higher. Formula for C7: =COUNTIF(C$2:C$4;">"&C2)+1
* Lines 12-14 give your earnings this round. This is done by multiplying your bid by the value in lines 7-9. Formula for C12: =C2*C7
* Line 15 gives the average of those three earnings. Formula: =SUM(C12:C14)/3
* Lines 18-20 give the cumulative earnings. Formula for C18: =C12+B18
The formula that I used, for the player in line 2: =IF(C7=1;C21;IF(C12
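The spreadsheet logic can be mirrored in Python (my reconstruction of the strategy described at the top of this post: the highest bidder(s), ties included, drop by one; below-average earners raise by one; everyone else stands pat; all names are mine):

```python
import random

def simulate(rounds=300, seed=0, players=3, max_bid=10):
    rng = random.Random(seed)
    bids = [rng.randint(1, max_bid) for _ in range(players)]
    for _ in range(rounds):
        # Payout: your bid times (1 + number of strictly higher bids).
        earnings = [b * (1 + sum(1 for o in bids if o > b)) for b in bids]
        avg = sum(earnings) / players
        top = max(bids)
        nxt = []
        for b, e in zip(bids, earnings):
            if b == top:                      # ties count as 'highest', as in the sheet
                nxt.append(max(1, b - 1))
            elif e < avg:
                nxt.append(min(max_bid, b + 1))
            else:
                nxt.append(b)
        bids = nxt
    return bids

print(simulate())   # the race to the bottom ends with everyone bidding 1
```

Because tied top bidders all decrement together, the bids collapse to a common value and then march down to 1, which matches the spreadsheet's behavior.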

Lorenzo Carnevale
Italy Roma

It is quite interesting. It needs to be worked on to create a fun mechanic and not a headache-inducing brain-burner.
Variants that come to my mind:
1) The first/active player must declare his choice aloud before the other players bid (secretly).
2) The first player is forced to choose a particular role.
3) The bid is done via card drafting.
4) Not all values in the range are available.
5) The "prize" is not necessarily (your bid)*(# of players who bid higher)

Wim van Gruisen
Netherlands Den Bosch Unspecified

Solaris wrote: It is quite interesting. It needs to be worked on to create a fun mechanic and not a headache-inducing brain-burner.
Meh. One person's headache brainburner is another one's fun mechanic.



I love the math. Thanks for doing it!



I'm not strictly reminded of a specific example, but what this mechanic you're suggesting reminds me of is an 'economics experiment' which runs like the following.
"If there is ever no money in this bowl the game ends." "If there is money left over in this bowl I'll add 5 dollars." "There are four of you playing, in order for me to add money to the bowl each of you has to take money from the bowl. "The bowl starts with 5 dollars in it."
The first time people play this game, if there are no turns, everyone grabs for the money. If there are turns, often one of the players in the first "round" goes and takes all the money.
Almost always, if the first player takes a bit of time to think, and/or when talking occurs, someone quickly realizes how easy it is to ensure that everyone can make a great sum of money off the person running the experiment.
Now, any rational person wouldn't put themselves in such a position. But hey, economists are interested in seeing how fast others get rational. I like your mechanic because it's kinda like trying to make money off the other players...
... the question I have is where does the money come from? Are the players paying each other? Is there a cap on the money, does turn order conflict with that cap?

Adam McD
United States Minnesota

PGames_DM wrote: ... the question I have is where does the money come from? Are the players paying each other? Is there a cap on the money, does turn order conflict with that cap? One thought of mine would be for this mechanism to be one of the main ways for players to earn money (from 'the bank'). Let's say it's a 3-6 player game. Have the 'earning' round be one where players choose a number between 1 and 10, and they get paid according to this mechanism. A good take would depend on the # of players, but I'd think that $15-$20 would be a good total to have.
What's good about using #'s 1-10 is that players should be able to calculate their winnings easily (w/o a calculator).
Other things might factor in as well. If some sort of 'building' round follows the money round, and you are $10 short of building something awesome, perhaps you will just bid '10' to be safe. On the other hand, if someone needs $15 to buy something to win the game, and you want to slow him/her down, then perhaps you would bid really low (like a '2' or '3') just to make sure you're putting little or no money in that player's pocket.

Michael Clark
United Kingdom Bucks

I think this is a very interesting mechanic. Enough maths for players to discern some general principles about maximising their income, but with enough chaos to allow somebody to take a punt and strike gold once in a while. But will it just be a mathematical game, or can you fit the mechanic to a theme? The game show idea you mentioned sounds a bit artificial. How about this?
You're a merchant on an ancient trade route between Europe and the Far East, selling your exotic goods to European traders. The further away from Europe you are, the greater the value of the goods and the more money you can charge to the traders who come your way. None of the merchants are sought after more than the others and every year an equal number of traders will come and visit each one specifically. The traders will come and buy the goods of the merchant they are visiting but, of course, on the way there they will pass by the merchants at the earlier points on the route and will take the opportunity to buy goods from them too. In this way, merchants at the start of the route will get more customers but won't be able to charge much and merchants at the end of the route will get fewer customers who will pay a lot.
I think that's isomorphic to your mechanic, and a little more colourful. I'm not sure what else these merchants are doing during the game, though!

Wyckyd
Netherlands Groningen Groningen

Quote: It was a race to the bottom.
That's what I expected as well. Mainly because of what was said in the starting post:
Quote:
P1: $90.00 (= $10.00 + 8*$10.00)
P2: $160.00 (= $20.00 + 7*$20.00)
P3: $210.00 (= $30.00 + 6*$30.00)
P4: $240.00 (= $40.00 + 5*$40.00)
P5: $250.05 (= $50.01 + 4*$50.01)
P6: $240.00 (= $60.00 + 3*$60.00)
P7: $210.60 (= $70.20 + 2*$70.20)
P8: $160.00 (= $80.00 + 1*$80.00)
P9: $91.23 (= $91.23 + 0*$91.23)
P10: $90.00 (= $10.00 + 8*$10.00)
The middle values have the highest expected values. This means that the optimal amount to bid is going to be somewhere around 50. Because everybody knows this, the optimal amount is going to be just below 50, and so on. A race to the bottom. So at what point would you gain more by just accepting that your bid is going to be the highest? At the point where your gain is more than you are giving away: your 100 should be more than the average bid times the number of other players. Looking at a 10-player game, the bidding should have dropped to about 11 (100/9 is roughly 11) before bidding the max becomes attractive.
That cuts out a lot of the bidding space.
I think it would work better if the bonus for players bidding higher would be a set amount. You just get $10.00 for each player that bids more. That way, bidding 60 when somebody else bid 50 will just let them catch up to you, instead of sprinting past.
And, people would really look hard at what the minimum is they would need this turn. If you only need $33, go for it. It'll probably net you an additional $90, and you won't give money away to anybody. Unless someone figured they could get by with even less this turn.
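This flat-bonus variant is easy to sketch in code (my own sketch; the function name and the $10 bonus are just the suggestion above):

```python
def flat_bonus_payouts(bids, bonus=10):
    # You get your bid, plus a flat bonus for each player who bid strictly
    # higher (rather than your bid again, as in the original rule).
    return [b + bonus * sum(1 for o in bids if o > b) for b in bids]

# Bidding 60 against a 50 only lets the 50 catch up, not sprint past:
print(flat_bonus_payouts([50, 60]))   # -> [60, 60]
# Needing $33 in a 10-player game: if all nine others bid higher, 33 + 9*10 = 123.
print(flat_bonus_payouts([33] + [50] * 9)[0])   # -> 123
```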
(Cool idea btw!)

Wim van Gruisen
Netherlands Den Bosch Unspecified

Wyckyd wrote: Quote: It was a race to the bottom. That's what I expected as well. Yeah, but in my case it was because of the way I formulated the strategy. Especially this part:
Quote: When your bid is the highest in round X, bid one lower in round X+1. This strategy leads to the highest bidder bidding one lower every round. At a certain point he gets equal with the next-highest bidder. However, the way I wrote the formula, if you bid the same as another player, and nobody bids more, both bids are considered 'highest', and both players bid one less the next time. At a certain moment they bid the same as the next-highest bidder, and suddenly that player's bids are also considered the highest. And so he will also lower his bids.
It was a stupid mechanical issue that drove the bids down. In real life, with three persons, people would stop at bidding four, I think, and not go lower (if you bid three, the most you can win is nine, and you're better off by bidding ten). The formula that I penned down was not as smart as a normal person, though.

Wim van Gruisen
Netherlands Den Bosch Unspecified

Wyckyd wrote: I think it would work better if the bonus for players bidding higher would be a set amount. You just get $10.00 for each player that bids more. Another way would be to play with the multiplier. In the current game, with ten people, nobody would bid less than $10; since the maximum multiplier is 10, if you bid 9, the most that you could make is $90, so you'd be better off bidding 100. In general, the lower threshold is (highest possible bid)/(number of players).
Say that the multiplier works differently. Instead of the number of people that bid higher, it is (the number of people that bid higher) minus (half the number of players), with a minimum of 1, so there is no negative income.
Say that player 1 bids 10, player 2 bids 20, and so on until player 10, who bids 100. With the rule proposed above, players 6 to 10 just get what they bid, since there are fewer than five people above them. Player 5 gets twice his bid, player 4 thrice his bid, and so on.
With such a system, it isn't worthwhile to bid one less than the leader; your scores don't get doubled. If you want to bid high, you might just as well bid the highest anyway.
But if everyone bids 100, it becomes lucrative to bid 99 :)
I think (but it's an assumption that needs to be researched) that with this mechanism, bidding high is still a valid strategy. Because you run a serious risk of getting a multiplier of no more than 1, you might as well bid high and gather all the money.


