Tuesday, September 2, 2014

Are ODI Scores Increasing?

I had a conversation with a sports blogger, John Rogers, on Twitter last week. John Rogers had tweeted a link to a blog post he had written on why the WASP projection being used in BSkyB's coverage of limited overs cricket this English summer is necessarily inaccurate. His point is that ODI cricket is evolving quickly, both in the equipment and the style of batting, so that historical data is a poor guide to how many runs you can expect a team to score.

There is always a tradeoff in statistical work between using only the most recent data to capture trends, and using a longer time period to get more statistical significance. Now, in principle, since WASP is calibrated to a par score set by the broadcast commentators, any trend in scoring that has occurred within the period of the data used to estimate the model could be adjusted for in the par score. The setting of a par score is both a strength and weakness of WASP. The strength is that it allows game-specific information to be factored into the projections, such as using local knowledge to assess how the pitch is likely to play. The weakness, however, is that the commentators might suffer from the common human biases of seeing patterns in essentially random data, and I wonder if the view that batting power is increasing is an example of that.

So I was interested to see if John's perception of a recent increase in scoring rates due to teams having more "lower-order hitters", better bats, etc. is borne out in the data. There is no doubt that there has been an increase in scoring over time. For example, all of the 16 ODI matches (all involving top-8 countries) where the team batting second has scored 330 or more have occurred this century. Only 5 of those 16, however, occurred this decade, suggesting that maybe the changes are not so recent.

Extreme scores like these are not necessarily indicative of a general trend, so some regression analysis is called for. John's hypothesis seems to be mainly based on increased rates of scoring by lower-order power hitters near the end of the innings. I don't have the full ball-by-ball database to hand, just a record of scores and results, but if the theory is correct, it should show up in total scores. Now WASP is currently based on ODI data from 2006 involving the top-8 teams, so I had a look at all non-rain-shortened games involving those teams from May 1 2006, using a dummy variable for each year starting May 1. First, I looked at the evolution of first-innings scores over that time. To control for different abilities across countries, I ran an OLS regression of first-innings score on dummy variables for the team batting first and for the team bowling first, as well as a dummy variable for each of the 8 years in the database. To further control for differences across grounds, I restricted the data set to games played at grounds where there were at least 10 matches played in this period, and included a dummy variable for each ground. This left me with 245 games. The results are shown by the blue line in the graph below (left axis): the average first-innings score for the average team against the average team at the average ground. There has clearly been very little change over these 8 years.
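For the curious, the shape of that regression is easy to sketch. Everything below is a stand-in: the match records, team labels, and scores are simulated, not the actual 245-game dataset.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical match records standing in for the real 2006-onward database.
teams = ["AUS", "ENG", "IND", "NZL", "PAK", "RSA", "SRI", "WIN"]
n = 245
df = pd.DataFrame({
    "bat_first": rng.choice(teams, n),
    "bowl_first": rng.choice(teams, n),
    "year": rng.integers(0, 8, n),  # season index, 0 = year starting May 2006
})
df = df[df.bat_first != df.bowl_first].reset_index(drop=True)
df["score"] = 250 + rng.normal(0, 30, len(df)).round()

# Design matrix: dummies for batting team, bowling team, and season
# (drop_first leaves one baseline category in each group).
X = pd.get_dummies(df[["bat_first", "bowl_first", "year"]].astype(str),
                   drop_first=True).astype(float)
X.insert(0, "const", 1.0)
beta, *_ = np.linalg.lstsq(X.values, df["score"].values, rcond=None)

# The season-dummy coefficients trace the "blue line": the scoring trend
# after controlling for team ability.
season_effects = dict(zip(X.columns, beta))
trend = {k: v for k, v in season_effects.items() if k.startswith("year_")}
print(trend)
```

The real analysis also carried ground dummies; they'd enter the design matrix the same way.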

John's blog post, however, seemed to refer specifically to the ability of teams to chase down large scores, so I separately looked at whether there has been a change in the probability of the team batting second winning, using a probit regression. Because differences in grounds largely affect ease of scoring in both innings, and because probabilistic models require more data to get precise estimates, I used the full dataset without dummy variables for the ground, but again controlled for team ability and included dummy variables for each year. The results are shown in red on the same graph (right axis). Given how data-hungry probabilistic models are, I wouldn't put too much faith in the estimates for any one year. But there doesn't seem to be a clear recent trend towards it becoming easier to chase down scores, although there was a strange dip in the period 2007-2009 that has since been reversed.
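A minimal sketch of such a probit, on simulated data: here there is just an intercept and a single made-up "late period" dummy, whereas the real model carried team-ability and per-year dummies. The coefficients and sample are entirely hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(1)

# Simulated matches: did the team batting second win?
n = 2000
late = rng.integers(0, 2, n).astype(float)   # made-up late-period dummy
true_beta = np.array([-0.1, 0.3])            # intercept, late-period effect
X = np.column_stack([np.ones(n), late])
p = norm.cdf(X @ true_beta)                  # probit link
win = (rng.random(n) < p).astype(float)

def neg_loglik(b):
    # Probit log-likelihood: win ~ Phi(Xb)
    z = X @ b
    return -(win * norm.logcdf(z) + (1 - win) * norm.logcdf(-z)).sum()

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print(fit.x)   # should land near (-0.1, 0.3) with this much data
```

In practice you'd reach for statsmodels' `Probit` rather than hand-rolling the likelihood, but the mechanics are the same.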

I suspect what is happening is a perception bias. There probably has been a recent increase in power hitting as a result of batsmen taking more risks, but that has been balanced by an increase in the rate of dismissals. And this leads to the reality being different from common perceptions. 20-20 has conditioned us to thinking that it is easy to score 8-9 runs an over on small grounds with flattish pitches. And it is. But it requires aerial shots, unlike 5-6 an over, which can be achieved entirely along the ground with 1s and 2s and the occasional bad ball cut or driven for 4 along the ground too fast for the cover sweeper to collect. With modern bats and batting, it is not difficult to sustain 8-9 an over through regular sixes and lofted 4s, but it is hard to do so without losing regular wickets. But wickets arrive randomly. Now think of the commentators bias. If a batsman hits a clean six, he is lauded for his good shot. If he mistimes it and is caught, as often as not he will be criticised for "taking unnecessary risk" or "not waiting for the right ball". (Have you ever heard a commentator criticise a batsman for taking unnecessary risk after making a clean hit for 6?) This creates an impression that the good shots are normal, and the wickets are just an avoidable failure rather than both being natural consequences of a particular level of aggression. Combine that with our recollections of past matches. Sometimes a team chasing 120 off 72 balls will have a randomly good passage scoring at that rate without any lofted shots going to hand, and it will make it look like such fast scoring is easy. At other times, we will see a procession of wickets and we will be thinking how the batsmen threw the game away. It is the first case that sticks in our mind when we make our own assessment of probabilities, and so we inflate in our own minds what the probabilities of winning are when a team is chasing a large total.

Data (even historical data that may become out of date) is a good antidote to these perception biases.

Implausible counterfactuals

Imagine that Winston Peters had a more clever web team and came up with a nifty infographic. The infographic asked you to input your race. Then it showed how many more jobs each race got in New Zealand in the period 1980 to present. If you'd selected "Asian" at the start, the infographic would tell you that your group got tens of thousands more jobs over the period than your group would have gotten if job growth had been distributed "evenly" instead. If you'd selected "White" or "Maori", you'd see how many fewer jobs your group had gotten, with calls of They Took His Job to follow. A parallel one would show Kiwis how many more houses they'd have if growth in the number of houses owned, by race, had been evened out.

It would be pretty obvious that the infographic was nonsense. Immigration from Asia over the period meant that we had more growth in the number of employed Asians. If those migrants hadn't come, it's not like we'd have had more jobs left for everybody else: that's the lump of labour fallacy. The counterfactual is implausible. 

Well, the NZ Herald's nifty new infographic asks you to input your household income and some basics on your household composition. It then tells you which decile you're in. That's all fine. But when you then hit "Next", you get this. I'd inputted $5000 per week as an arbitrarily large income to get a top-decile household. 
Over the last 30 years, all incomes have grown, but not at the same rate. A household like yours is now 29% – or $23,621 a year – better off than it would be if all incomes had grown evenly.
If you choose a lower-income household, it tells you how much worse off you are than you would have been if growth had been equal across deciles. So the counterfactual is a New Zealand that had the same overall growth over the period, but where all deciles experienced even growth. The problem here is that there is no way of evening out those decile growth rates without affecting overall growth. Higher income taxes for more redistribution can even things out, but they also reduce total growth. People can take whatever values-based position they want on how much the state should redistribute. But we ought to at least start by recognizing that it's not a free policy. 
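The counterfactual's arithmetic is simple to sketch. With made-up decile incomes (none of these figures are the Herald's), "even growth" means scaling every group by the same economy-wide growth factor:

```python
# Hypothetical per-group incomes, base year vs. today (uneven growth).
base = [20_000, 30_000, 45_000, 80_000]
actual = [26_000, 40_000, 65_000, 140_000]

overall_growth = sum(actual) / sum(base)   # economy-wide growth factor
even = [b * overall_growth for b in base]  # "even growth" counterfactual

for a, e in zip(actual, even):
    gap = a - e
    print(f"actual {a:>8,} vs even-growth {e:>10,.0f}: {gap:+,.0f} ({gap / e:+.0%})")
```

Note the counterfactual holds the aggregate fixed by construction: the gains it assigns to the bottom come straight out of the top, with no mechanism specified for how that redistribution would leave total growth unchanged.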

The recommended approach is to consider whether or not to include deadweight losses on a case-by-case basis. As a general rule, deadweight losses should be included if they are of sufficient size relative to the overall costs and benefits of the proposal that they are capable of altering the decision as to whether or not to proceed with the proposal.  
Having said this, deadweight losses are notoriously difficult to quantify. Estimates vary from 14% up to 50% of the revenue collected. Treasury suggests a rate of 20% as a default deadweight loss value in the absence of an alternative evidence based value. Thus public expenditures should be multiplied by a factor of 1.2 prior to discounting to incorporate the effects of deadweight loss. 
If you want to get one dollar from a high decile person to a lower decile person, you should reckon on its costing $1.20. If you want to even out growth entirely, well, those costs are going to be much higher. I'd believe the 1.2 for top income tax ranges from 30% to 40%. It would be an interesting exercise to work out just how much higher the top marginal tax rate would have needed to be to even out growth entirely over the last three decades. I expect that the rate would be really rather high, and that the associated deadweight losses would be rather large as well.
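The gross-up itself is a one-line multiplier; a sketch using Treasury's 20% default and the 50% top of the quoted range:

```python
def gross_cost(transfer, dwl_rate=0.20):
    """Resource cost of moving `transfer` dollars through the tax system,
    at a given deadweight-loss rate (0.20 is Treasury's default)."""
    return transfer * (1 + dwl_rate)

print(gross_cost(1.00))        # $1.20 to deliver $1.00
print(gross_cost(1.00, 0.50))  # $1.50 at the top of the quoted range
```

And recall that deadweight losses rise more than proportionally with the tax rate, so "evening out growth entirely" would sit well above the 1.2 end of this range.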

The Herald infographic invites the reader to assume that those deadweight costs don't exist. Dey Turk Er Income, except that most of that income wouldn't have existed for anybody in the counterfactual.

There are a few policies around that can improve outcomes in lower income cohorts at relatively low cost. Improving education and training is one. But there are no policies that would equalise income growth across deciles without simultaneously substantially affecting total growth.

Train sets

There's one good thing we can say about Labour's $100m light rail plan for Christchurch: at least it isn't a $1b light rail plan. Otherwise, it's not so hot. 

I'm blogging from Hong Kong, where I'm attending the Mont Pelerin Society's meetings and greatly enjoying their fantastic rail service. Commuter rail and an extensive and efficient bus network are pretty critical to this place's working: there's no way this many people could move around without it. The city is dense and compact. Christchurch is, well, the opposite of that.

Liberty Scott provides a few bullet points on Labour's current plan:
  • Christchurch last had the remnant of a local rail service in 1976 when a once daily, yes once daily, service between Rangiora and Christchurch was scrapped because of lack of patronage.  The last regular service (as in all day service like in Wellington) was between Lyttelton and Christchurch, which ended when the road tunnel was opened in 1972 (the rail service only had an advantage over driving over the Port Hills).  Before that, other services were discontinued during the 1960s as bus services proved more cost effective and car ownership rose.  Christchurch's population grew by over 50% in the period between the end of these services and the earthquake.
  • It won't unclog Christchurch's roads.  The Press report says Labour intends the system to accommodate 10% of commuters from the north to central Christchurch.  Phil Twyford says there are 5000 - yes 5000 commuters making this trip (10,000 trips), so it is $100 million for 500 commuters.  That comes to $200,000 per commuter, before any operating subsidies are considered.  In other words, the price of a Porsche 911 for each commuter.  Taking about 400 cars off of Christchurch's roads every morning isn't going to "unclog" them,  it hardly makes a difference.
  • However, what it might do is encourage more people to live further away from the surrounding suburbs closer to the city, because it subsidises living well outside Christchurch.  That's hardly conducive to reducing congestion, nor environmentally sustainable.  It would be far more preferable to focus on finishing renewing the local road network including marking out cycle lanes, than to incentivise living well out of the city.
  • A commuter rail service to central Christchurch can't even go there, as the station is 4km from Cathedral Square, in Addington.
  • The $100 million is to double track the line to Rangiora, and rebuild some railways stations, but not a new central station (which can't be anymore "central" than the old one on Moorhouse Avenue), nor new trains, although the ex. Auckland ones could be relocated, if a depot could be built, and sidings to put them on.
  • The rail service would replace commercially viable and some subsidised bus services, but politicians don't find buses sexy.
  • The service would lose money, a 1000 trip a day railway service is a joke.  Proper commuter trains in major cities carry that number on one train.  
  • If there really is demand for more public transport from the northern suburbs, it could come from commercial bus service.  Clearways could be used for bus lanes and the hard shoulder of the existing and future extended Northern Motorway could be used for peak bus lanes too, if needed.  Trains only make sense if buses are incapable of handling the volumes of demand, and that clearly isn't the case.
  • Christchurch was the first major city in NZ to scrap trams, because the grid pattern street network and low density of the city meant there were few major transport corridors to support high density public transport systems, like trams (and commuter rail).  It was also the first of the big four cities to scrap commuter rail altogether (even Dunedin had commuter rail services until 1982 to Mosgiel).   In short, the geography of Christchurch is as poorly suited to commuter rail as it is well suited to cycling.
Liberty goes on further - read the whole post.

I disagree with him a bit though. He should have used a Tesla S as the alternative, not a Porsche. The numbers work out about the same, and the Tesla is electric.
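Liberty's per-commuter figure is easy to reproduce:

```python
commuters = 5_000 // 10      # Labour targets 10% of the 5,000 north-side commuters
capital_cost = 100_000_000   # the light rail plan's capital cost
per_commuter = capital_cost / commuters
print(per_commuter)          # 200000.0: Porsche 911 (or Tesla S) money each
```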

Back when Mayor Parker was proposing train sets, the cost was higher. I'd then written:
The draft city plan has a $400 million rail line connecting downtown to the University campus. It's unclear that there's sufficient demand to justify such investment, but there might be on the City's creation of a proposed new international precinct downtown where international students would be invited to live. Those students currently live within walking distance of campus in a vibrant international hub at Church Corner and Riccarton where I can find great Chinese, Vietnamese and Korean food; Korean butchers and grocers; a Japanese bakery; and, all kinds of other diverse amenities (Korean and Chinese churches, etc). To the extent that the city is successful in moving all the students downtown, from where they'd need public transport to get to University, and so would need the $400 million dollar (more than $3k per household) rail line (or a far far cheaper designated busway), it would be by destroying an existing international hub.

Let's work through some numbers on rail. Suppose that the $400 million is financed through a 25 year bond issue paying 8%. For an annuity paying 8% to have a present value of $400 million over 25 years (in other words, for folks to be willing to give the City $400 million today in exchange for bonds), the annual payment has to be $37.47 million. The building costs alone for the rail line are then $103k per day for the next 25 years. And, suppose further that we're willing to subsidize each rail rider by $10 per ride. We'd then need 10,000 people riding the train every day just to cover the capital cost where we're willing to pay $10 per person per ride. By way of comparison, RedBus, which services most of Christchurch, carries 5.8 million passengers per year - an average then of just under 16,000 passengers per day. If a single rail line from downtown to the University carried as much traffic as the entire RedBus network, the effective per-passenger capital cost subsidy would be $6.50. If the train were running on a cost recovery basis, it would need to charge $6.50 per trip plus running and maintenance costs. If it covered only running and maintenance costs, the government would be kicking in $6.50 per trip. If it carried as much traffic as the entire RedBus network.
I was pulling punches there a little as I had the distinct impression that the University really kinda wanted that rail line and wouldn't appreciate staff saying otherwise; I was likely just paranoid.
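The annuity arithmetic in that quoted passage can be checked in a few lines, using the standard annuity identity PV = P · (1 − (1 + r)^−n) / r solved for the payment P:

```python
# Bond financing for the $400m rail line: 25-year annuity at 8%.
pv, r, n = 400_000_000, 0.08, 25
payment = pv * r / (1 - (1 + r) ** -n)
print(f"annual payment: ${payment:,.0f}")              # ~$37.47m
print(f"capital cost per day: ${payment / 365:,.0f}")  # ~$103k

# RedBus comparison: 5.8m passengers a year across its whole network.
riders_per_day = 5_800_000 / 365
print(f"RedBus passengers per day: {riders_per_day:,.0f}")   # just under 16,000
print(f"capital subsidy per ride at RedBus volumes: ${payment / 5_800_000:.2f}")
```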

When February's quakes hit, the bus routes changed quickly: the main depot was knocked out, so they ran temporary bus exchanges on Bealey Street and elsewhere. Road closures for repairs meant frequent re-routings. You can't do that with trains.

I also note that Labour's plan suggests some cost-sharing with Christchurch Council. Christchurch Council has no money for cost-sharing arrangements.


Friday, August 29, 2014

How do you mitigate a problem like a NIMBY?

I think I might have a partial solution to NIMBY blocking of urban intensification: a way of paying them at the margin for disamenity effects.

The one-line version: if your neighbour develops, your taxes drop.

Here's how we do it. Or at least the initial sketch-outline blog version of it. I'll expand on it later and, hopefully, fix the problems with it that you'll helpfully point out.

Consider a city of 10,000 dwellings and 12,000 households. Most of these dwellings contain one household, but some contain two households because there are more households than there are dwellings. The City collects $10,000,000 in taxes, with a $1,000 per-dwelling tax, on a standard Council rates system: the Council specifies how much money it needs to collect and that amount is apportioned across dwellings based on the relative value of the dwellings. Dwellings with higher total capital valuation pay more in tax. In this case, they're all identical for simplicity of exposition but nothing requires that they be identical or pay identical taxes.

Suppose that, in this set-up, somebody wants to put up an apartment building that would contain 100 dwellings to house 100 households. The developer pays Council a development levy that covers the building's interconnection costs: the costs the building imposes on Council. Since people would move into this building from existing overcrowded dwellings, there's no additional cost on Council of additional capitation-based services. Specify for now that each of these apartments has the same capital valuation as existing dwellings for simplicity, though again, that will vary in the real world. Council still needs to collect $10,000,000 in taxes in total to cover those services, so long as it's set the development levy correctly.*

Under the existing system, the $10,000,000 in taxes will now be spread over 10,100 dwellings rather than over 10,000 dwellings. Each dwelling consequently remits $990 in taxes. If the neighbours of the apartment building get more than $10 in disamenities from the apartment building's existence, they will lobby against its construction.

Now the RMA has some mechanism for identifying neighbours who are affected by the new development. Maybe some experience more traffic, maybe some lose a bit of view, and maybe others lose a bit of neighbourhood character. Specify that these effects, for this apartment building, extend over 100 dwellings in a circle around the new apartment building. Again, in the real world, it won't be a circle, but it doesn't matter. The RMA and Councils already have some mechanism for identifying affected neighbours; whatever that mechanism is has, in this case, identified these 100 dwellings.

Council needs to raise $10,000,000 in total, but nothing says that we need to spread the abatement provided by the new apartments to the city as a whole. In fact, on thinking about it, it seems pretty silly to spread the abatement so broadly. We've identified a set of affected neighbours who bear the costs of the new development but get the same tax abatement benefits as everybody else. Why not define a Special Ratings Area by the dwellings that experience disamenities from the new development, using whatever process is already in place for defining affected neighbours?

Let's instead specify that the total rates collected from both the new development and all the affected neighbours remains constant after the new development's construction. Those 100 dwellings used to remit, in total, $100,000 in taxes: $1000 each. Dwellings in the circle paid $100,000; dwellings outside of the circle paid $9,900,000. Outside of the circle isn't affected by the apartment building. We'll say now that all of the dwellings inside the circle, including the dwellings in the apartment building, have to remit $100,000 in taxes in total. Since there are now 200 dwellings in the circle instead of 100, the per-dwelling levy is now $500 instead of $1000. The dwellings outside the circle continue to pay $9,900,000 and the necessary $10,000,000 is collected in total. Now, neighbours would need to enjoy more than $500 per year in disamenity effects in order to wish to block the development.
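The two apportionments can be sketched directly from the numbers above:

```python
total_levy = 10_000_000
city_dwellings = 10_000
circle_dwellings = 100    # affected neighbours identified by the RMA-style process
apartments = 100

# Status quo: the abatement from 100 new dwellings is spread city-wide.
citywide_rate = total_levy / (city_dwellings + apartments)
print(round(citywide_rate, 2))   # 990.1: about $10 of abatement per dwelling

# Special Ratings Area: the circle's pre-development total is held fixed.
circle_total = circle_dwellings * (total_levy / city_dwellings)  # $100,000
circle_rate = circle_total / (circle_dwellings + apartments)
print(circle_rate)               # 500.0 for neighbours and apartments alike
```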

This doesn't solve every problem in the world. There are neighbours who would experience more than $500 per year in disamenities and would still NIMBY up. But there will be a range of neighbours in the $10 to $500 range who cease their opposition.

If we wished a stronger counter-NIMBY effect, we could say that all dwellings inside the circle remit in total the necessary $100,000, but that the new apartments are levied at the rates that obtain outside of the circle. Only the affected neighbours then enjoy the benefits of the Special Ratings Area. The total amount collected will be the same. But, in that case, and in this example, the new apartments each remit $1000 in taxes while the 100 affected neighbours each see a complete rates abatement. So we would only hear complaints from NIMBYs experiencing more than $1000 in disamenity effects.

If the apartment development were large enough, and if the number of affected neighbours were small enough, one could imagine scenarios where the neighbours received a negative rates bill: had there been 150 apartments each remitting $1000 in taxes, and the same 100 affected neighbours, there would have been $50,000 in surplus to distribute among the 100 affected neighbouring dwellings: a $500 cash bonus each instead of a $1000 rates bill. In that case, it would take $1500 in disamenities to trigger NIMBY activity.
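The stronger variant, including the negative-bill case, reduces to the same bookkeeping:

```python
circle_total = 100_000    # the Special Ratings Area's fixed total
outside_rate = 1_000      # rate paid by dwellings outside the circle
neighbours = 100

# Apartments are levied at the outside rate; affected neighbours split the rest.
for apartments in (100, 150):
    neighbour_bill = (circle_total - apartments * outside_rate) / neighbours
    print(apartments, neighbour_bill)   # 100 -> 0.0, 150 -> -500.0
```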

I doubt you would want this locked in in perpetuity.** I would expect the full abatement to apply in the first year. Perhaps after 10 years, the circle as a whole, including the apartments, could remit a total rates bill equal to the half-way point between the total amount remitted inside the circle prior to the development and the total amount that would be remitted had every dwelling inside the circle, apartments included, paid the same amount as those outside the circle.

The steady-state for the circle going from 100 dwellings to 100 dwellings plus 100 apartment-dwellings could then be $150,000 in total taxes rather than $200,000. Prior to the development, the 9900 dwellings outside the circle remitted $9,900,000 in total taxes; now they'd only need to cover $9,850,000, so their rates bill would drop from $1000 each to $995 each. Each of the 100 apartments would remit the same $995 in taxes, covering $99,495 of the circle's $150,000. The remaining dwellings in the special ratings area would remit $505 each in taxes. Everybody's better off. Affected neighbours get strong abatement. Other pre-existing dwellings see a small amount of abatement too. And we reduce overcrowding because we have found a way of compensating the NIMBYs.
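The steady-state split in that paragraph checks out:

```python
total_levy = 10_000_000
outside_dwellings = 9_900
circle_total = 150_000    # half-way between $100k (pre-development) and $200k (uniform)

outside_rate = (total_levy - circle_total) / outside_dwellings
apartments_pay = 100 * outside_rate   # the 100 apartments pay the outside rate
neighbour_rate = (circle_total - apartments_pay) / 100

print(round(outside_rate, 2))    # 994.95: the "$995 each" outside and for apartments
print(round(apartments_pay))     # 99495: the apartments' share of the circle's total
print(round(neighbour_rate, 2))  # 505.05: the "$505 each" for affected neighbours
```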

Now real world ratings systems are more complicated than this. More valuable dwellings remit more in tax. What I'm here establishing is a new Special Rating Area within which the city could apply its standard differential progressive capital value taxation scheme, charging more valuable dwellings a greater share of the amount that needs to be collected and less valuable dwellings a smaller proportion. It's just that instead of applying it over the city as a whole, they carve out areas around new developments as defined by the affected neighbours, and re-apply the standard apportionment formula to levy a total amount of rates across dwellings within that defined area. The rates bill for those in the area has to drop relative to what they pay in the current system, and NIMBY pressure consequently drops too.

Note further that these kinds of benefits should be stackable. If your dwelling is affected by two different new developments, you should see cumulative rates decreases.

Questions for readers:

  1. Does a system like this apply anywhere in the existing world?
  2. Are there obvious gaping holes that I'm missing?
  3. What seems like a fair and politically sustainable time path for the special ratings area?

I'm sure there are many practical implementation issues like the calculations for dwellings in overlapping special ratings areas. And maybe we'd want gradations within the Special Ratings Areas where the most affected dwellings see the most abatement. But this all looks pretty feasible.

It seems like a good idea. Surely somebody has thought of this before. And surely somebody else has explained why it can't work. I'll look forward to your pointers.

* In the real world, they could under- or over-shoot. I've heard many arguments that Councils currently have incentive to over-shoot, both because doing so shifts the tax burden from existing residents onto new ones, and because it discourages development and so heads off NIMBY complaints. I can deal with the latter problem here, but we'll otherwise assume that the developer levies are set correctly.

** And especially where new dwellings might cater to new residents rather than for a shuffling of existing ones: the Council's total budget then has to increase for services that have a per-capita cost, and we don't want to give those outside the circle strong reason to lobby against the new development.

Thursday, August 28, 2014

Independent is an interesting word

I presented to the Ministerial Forum on Alcohol Advertising and Sponsorship a few months back with a brief submission on recent evidence on the effects of alcohol advertising on consumption behaviour.

One pretty compelling recent piece of evidence is Jon Nelson's recent meta-analysis, published in 2011. The abstract:
This paper presents a meta-analysis of prospective cohort (longitudinal) studies of alcohol marketing and adolescent drinking, which accounts for publication bias. The paper provides a summary of 12 primary studies of the marketing–drinking relationship. Each primary study surveyed a sample of youth to determine baseline drinking status and marketing exposure, and re-surveyed the youth to determine subsequent drinking outcomes. Logistic analyses provide estimates of the odds ratio for effects of baseline marketing variables on adolescent drinking at follow-up. Using meta-regression analysis, two samples are examined in this paper: 23 effect-size estimates for drinking onset (initiation); and 40 estimates for other drinking behaviours (frequency, amount, bingeing). Marketing variables include ads in mass media, promotion portrayals, brand recognition and subjective evaluations by survey respondents. Publication bias is assessed using funnel plots that account for ‘missing’ studies, bivariate regressions and multivariate meta-regressions that account for primary study heterogeneity, heteroskedasticity, data dependencies, publication bias and truncated samples. The empirical results are consistent with publication bias, omitted variable bias in some studies, and lack of a genuine effect, especially for mass media. The paper also discusses ‘dissemination bias’ in the use of research results by primary investigators and health policy interest groups.
So he picked the papers that use a baseline and exposure design and concluded that there's really nothing much there except for publication bias.

The panellists didn't seem particularly friendly or unfriendly. Tuari Potiki asked why economists' conclusions on this stuff vary so much from those of the public health folks who'd presented earlier in the day, and the general tone of the Forum members seemed to be "what additional restrictions should we place" rather than "do any potential restrictions do more good than harm", but maybe I misread them.

Well, the anti-alcohol advocates didn't think the Forum was independent enough so they've made their own forum.* They're calling it the Independent Expert Committee on Alcohol Advertising and Sponsorship.

Independent's an interesting word, since the IECAAS is being hosted by Alcohol Action NZ, Doug Sellman and Jennie Connor's anti-alcohol lobby group, and consists of Sellman, Connor, Janet Hoek, Mike Daube and others. They reckon the Ministerial Forum, including NZ Drug Foundation's Tuari Potiki, wasn't independent enough because the CEO of the Advertising Standards Authority is also on the Forum. Sellman et al. are correct that the Forum members aren't experts in alcohol marketing, but I'm really unconvinced that that makes them less independent.

IECAAS writes:
To date IECAAS members have found no significant new research that would invalidate the recommendations made by the Law Commission in 2010. In fact the evidence supporting major reform appears to be strengthening. The recommendation to phase out alcohol advertising and sponsorship apart from objective written product information over five years is therefore as important today as it was when first reported to the government in 2010. The only difference is that New Zealand could have made several years of progress had the government responded.
I wonder how hard they've been looking. There's a reasonably important piece in the Journal of Economic Surveys that they've missed. And a few others.

* I can't stop imagining Bender setting up his own theme park. Except this one would be way less fun than Bender's.

Tuesday, August 26, 2014

Reader mailbag: restrictive covenants edition

If the particular character of a neighbourhood is all that important, why don't residents protect it using covenants?

A reader emails me:
I don’t think it is Nimbyism if a neighbourhood wants to protect its own character. What is Nimbyism is denying others beyond your neighbourhood the same opportunity you had.
It seems counter intuitive to think a place like Houston which has few zoning laws gives local communities greater control to enable the protection of individual property rights by allowing those individuals to collectively agree to covenant those rights (which include the protection of special character areas like Franklin Rd) and yet not to interfere with others who may wish a different way outside that zone.
High density advocates hate the idea that Houston communities that fringe CBD areas can continue to live a lifestyle that they have agreed to and also stop others (like Dhyrberg) from coming in and destroying it.
I know that many new developments come with covenants restricting future use of the property: developers expect that residents want rules binding both themselves and their neighbours. I don't want to live in that kind of place, but in a world of heterogeneous preferences, some prefer homogeneity.

Is there anything legally that would stop residents in places like Epsom, Grey Lynn, or any of the other hotbeds of development discord, from jointly agreeing to bind themselves against future development?

Under the status quo, everyone on the street seems to have been given a property right in what anybody else does with their property even though no covenant was put in place. It's an odd conception of property rights to say that, because I bought my house with certain expectations of what my neighbours might do, I am therefore allowed to veto anything they may wish to do with theirs.

Imagine some street where most residents put value on the street's current character; some on the street would prefer to turn their houses to higher-density use. The current rules let the character-amenity people shout a lot and block the development; those wishing to develop have to pay off all the potential veto players in order to prevent their blocking. Shouting is cheap and, since a developer would have to pay off every potential shouter, there is an incentive to pretend to care more than you really do. I'm sure much of the shouting is genuine. But we have little sense of the real dollar value of the experienced disamenity.

An alternative framework would have those who love the neighbourhood's particular character draft up a covenant agreement and try to get all the owners to sign on. If there are neighbours who were set to re-develop instead, they'd either not sign and not be bound, or be paid by their neighbours to take on the covenant's provisions.

Coase tells us that in low transaction cost environments the two scenarios should be equivalent. Coase also tells us that all the interesting action is in the high transaction cost real world. Is it cheaper to overstate your preference against a neighbour's re-development, or to overstate your willingness to turn your house into a 3-storey set of condos to try to induce payments not to? The former is pretty easy. The latter generally takes a set of architectural and engineering drawings plus building consent applications.

I wonder whether it would be workable to do away with neighbours' ability to object to anything other than real environmental effects, like shading, and to replace that regime with a menu of covenant options that neighbours might wish to impose upon themselves consensually.

Thanks to my correspondent for useful discussion.

Monday, August 25, 2014

Somebody arbitrage and fix please

I've been explaining to folks 'round the office why we might wish to pay more attention to iPredict's markets on who will be Prime Minister than to the vote share markets. And I thought I might share it with you.

National's back up over 70% in the PM.National contract. If National wins, that contract pays $1; if they lose, it pays $0. It dropped into the 60s last week during the publicity around the Hager book, but it's now back up.

But, if we look at the major party vote share markets, it's hard to see how National could possibly be 73% likely to win. National's predicted to get 43% of the vote; Labour and the Greens are together predicted to get 43% of the vote; minor parties get 14%. While NZ First may be more likely to go into coalition with National, Internet/Mana isn't, and if the Conservatives fall just short of the threshold at, say, 4.4%, their wasted votes come disproportionately from voters who would otherwise have backed National.

There's a bit of a problem in all the vote share markets though. You can only bring so much money into iPredict at a go, and folks there might be liquidity constrained. The winner-take-all markets can then just be more interesting. The VS markets, paying off at a penny for every percentage point earned by the party in the election, give little chance of large gains or losses. You can sink a whole pile of money into that market to get maybe a cent or two's return on a 43-cent investment. It's not all that great. The PM markets provide a less certain return, as there's a bigger chance of large losses if your expectation of the probability is wrong or if the wrong side of the weighted coin turns up, but the 70-cent investment either gets you a dollar or it gets you nothing.
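To make the two payoff profiles concrete, here's a rough sketch of what a $100 stake looks like in each market, using the prices quoted in this post. The vote-share outcomes chosen are illustrative, and this ignores iPredict's actual deposit limits and fees:

```python
# Compare a $100 stake in the continuous vote-share (VS) market against
# the same stake in the winner-take-all PM market, at the quoted prices.

stake = 100.0

# Continuous VS contract: costs $0.43, pays $0.01 per percentage point
# of National's final vote share.
vs_cost = 0.43
vs_contracts = stake / vs_cost
for outcome in (42.0, 43.0, 46.5):           # illustrative vote shares
    payout = vs_contracts * outcome / 100.0  # $0.01 per point per contract
    print(f"VS market, National at {outcome}% -> profit ${payout - stake:+.2f}")

# Winner-take-all PM contract: costs $0.70, pays $1 if National wins, else $0.
wta_cost = 0.70
wta_contracts = stake / wta_cost
for payoff in (1.0, 0.0):
    payout = wta_contracts * payoff
    print(f"PM market, contract pays ${payoff:.0f} -> profit ${payout - stake:+.2f}")
```

The VS position moves by a few dollars either way across plausible outcomes, while the PM position returns roughly +$43 or -$100 — which is the point: for a trader who can only bring limited money onto the site, the winner-take-all market is where the action is.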

How can we tell that the continuous payout structure is the problem? iPredict also has a market where the National Party vote share pays out in buckets: one contract pays $1 if the vote share is over 43% (and $0 otherwise), another pays $1 if the vote share is over 43.5%, another for 44% and up, and so on through 49%.

In the vote share (continuous) market, you pay $0.43 for a contract giving you $0.01 for every percentage point of the National vote. In the vote share (discrete) market, at current prices, you would pay $0.90 for a contract paying out at $1 if National gets more than 43% of the Party Vote. You'd pay $0.83 for a contract paying $1 if National gets more than 44% of the Party Vote. You'd pay $0.67 for a contract paying $1 if National gets more than 45% of the Party Vote. You'd pay $0.59 for a contract paying $1 if National gets more than 46% of the Party Vote. 46.5% is at $0.55 and 47% is at $0.48. So the market, in those bucketed contracts, expects National to get between 46.5% and 47%; the parallel Labour ones have Labour getting between 28% and 29%. That's rather more consistent with a 70% chance of National's forming government.
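Those bucketed prices can be read as a rough implied distribution: each price is (approximately) the market's probability that National's vote exceeds that threshold, and the implied median sits where that probability crosses 50%. A quick sketch, using only the prices quoted above and simple linear interpolation between thresholds:

```python
# Read an implied median vote share out of the bucketed contracts.
# Each price is treated as P(National vote share > threshold).
# Prices are the ones quoted in the post.

buckets = {43.0: 0.90, 44.0: 0.83, 45.0: 0.67,
           46.0: 0.59, 46.5: 0.55, 47.0: 0.48}

def implied_median(buckets):
    """Linearly interpolate where P(vote share > x) crosses 0.5."""
    pts = sorted(buckets.items())
    for (x0, p0), (x1, p1) in zip(pts, pts[1:]):
        if p0 >= 0.5 >= p1:
            return x0 + (p0 - 0.5) / (p0 - p1) * (x1 - x0)
    return None

median = implied_median(buckets)
continuous_mean = 43.0  # the $0.43 price in the continuous VS market

print(f"bucketed-market implied median: {median:.2f}%")
print(f"continuous-market implied mean: {continuous_mean:.1f}%")
```

The crossing lands just under 47%, a good three to four points above what the continuous market's $0.43 price implies — which is the gap between the two markets that the rest of the post is pointing at.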

But watch for Winston. NZ First is at 4.8% in the standard vote share market, but he's also odds-on to take more than 5.5% of the vote. The Conservatives only have a 29% chance of topping 5%.

Somebody with time ought to go in and fix things so there isn't free money sitting between the bucketed and continuous vote share markets.