Introducing the "Balcony Index".

A tentative tool to measure the strength and sustainability of an NA.

I've been writing here a lot about strong and weak NAs, hosting numbers and statistics. Now, how do you define a strong NA? To measure whether an NA is strong, sustainable and supports the rest of the organization, I would suggest the following four criteria:

1.) Hosting numbers: Sending a delegation is way easier than organizing a camp, so I guess counting hosted programmes is better than counting sent delegations.
2.) Number of chapters: The more chapters, the more sustainable an NA is. If one chapter goes down the drain, another one may grow instead. If one goes bankrupt, another may help out. The more (independent) chapters, the better.
3.) EEC members: A strong NA should be able to contribute to the international work done. This becomes most obvious in the people that are in the top leadership positions of the organization.
4.) Mosaic: International camps are a matter of hosting and sending delegations. A strong NA runs year-round programmes for locals as well. If a kerosene tax is introduced and international travel gets more expensive, international camps will decline. Mosaic will stay.

Now, many organizations use indexes to measure criteria that are otherwise hard to grasp:
ESPN recently presented the Soccer Power Index to rank national teams. Transparency International regularly sorts countries by the Corruption Perception Index, and The Economist invented the famous Big Mac Index.

So, to bring all those criteria listed above that characterize a strong NA together in one number, I hereby suggest the "Balcony Index (BI)". It shall be calculated as follows:

BI = Hosted International Camps + Chapters + EEC-Members + Mosaic Projects
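As a sketch in code (the function and parameter names are mine, and the figures in the example are made up, not any NA's real numbers):

```python
def balcony_index(hosted_camps, chapters, eec_members, mosaic_projects):
    """Balcony Index: the plain sum of the four indicators."""
    return hosted_camps + chapters + eec_members + mosaic_projects

# Made-up figures for a hypothetical NA:
print(balcony_index(hosted_camps=12, chapters=30, eec_members=4, mosaic_projects=12))  # -> 58
```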

And here are the results for 2008, reduced to the top 20 (full table here):

USA    58
SWE    48
ITA    45
NOR    32
BRA    31
CAN    27
FRA    27
GER    27
DEN    26
AUT    15
PHI    14
FIN    13
ESP    12
POR    11
COL    10
JPN    10
EGY    8
GBR    8
MEX    7
NDL    6

The ranking goes pretty much in line with my personal impression of the "strength" of the NAs - just France seems a little too high up there. Interestingly, there is somewhat of a gap after Denmark, so I think it's reasonable to speak of a G9 representing the strongest NAs: USA, Sweden, Italy, Norway, Brazil, Canada, France, Germany and Denmark.

Obviously, the BI has a few flaws:

- Interchange isn't represented at all.
- The four factors are not independent; in fact, they depend on each other.
- "Financial strength" is not part of it, which may play a huge role in sustainability.
- There is also no criterion for something like "brand awareness", "publicity" or "media attention".

Any suggestions for improvement are welcome.

Sidenote: When I presented the idea of the Balcony Index to Bebbe from IO during the RTF in Hamburg earlier this year, he suggested adding "number of CISV shirts printed", which I thought was funny and even a bit logical, but hard to track down.

Nice initiative Nick! Being a fan of statistics and these kinds of posts, I have a few questions I think you should consider:

1. What defines a chapter? Chapters that host, chapters that send? Using this index, Austria for example (nothing against Austrians) got 3 more points (equivalent to hosting 3 camps!) when decentralising into 4 chapters. Or Denmark, which (I think) opened some chapters for fundraising purposes.

2. Should all programmes be measured equally? It is a good measure in terms of human resources, but in terms of financial stability it is ambiguous. Then again, making it more complicated could make the Balcony Index less readable and maybe less interesting.

3. Mosaic is a very good measure in my opinion. Again, the human and financial resources needed to run Mosaic projects are way less than those needed to host a camp. And in many chapters Mosaic depends on one or two people, so it is not sustainable at all. Also, many projects like these are run by JBs in a non-official way. A good measure for the BI could be based on JB participation (JB trainings hosted, regional meetings hosted...).

4. Hosting AIM? It's no easy task, as we know. It could be a good measure, despite the fact that few NAs have had the opportunity to do it. And AIM participation can also show strength.

5. Committee members: I'd go for full members instead of just EEC. Often, NAs have a lot of involved people, but not many are in the EEC. For example:

Italy: 0 EEC, 7 Full members
USA: 1 EEC, 7 Full members

and on the other hand:

Costa Rica: 2 EEC, 2 Full members
Israel: 1 EEC, 2 Full members
Argentina: 1 EEC, 1 Full member
Norway: 4 EEC, 4 Full members.

(using the Committee personnel doc from AIM09)

Maybe something like an AIM Participant/EEC or Full member/EEC ratio could be better.

These may be some silly points, but looking at the list I was quite surprised by some countries' positions, mainly Great Britain and Finland.

Cheers Nick, good initiative indeed!

A more xkcd-ish BI, which also considers stability/growth, would be:

LaTeX:
\documentclass[11pt]{article}

\usepackage{amsmath}

\begin{document}
\begin{multline*}
BI = \sum_{n=0}^{10} \frac{\text{programmes hosted in year } (\text{current year} - n)}{n+1}
   + \text{number of chapters} \\
   + \text{number of chapters opened or closed in the past 10 years}
   + \frac{\text{number of EEC member-years in the past 10 years}}{10} \\
   + \sum_{n=0}^{10} \frac{\text{Mosaic programmes in year } (\text{current year} - n)}{n+1}
\end{multline*}
\end{document}
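A rough Python equivalent of that formula (a sketch; the function and parameter names are my own, not anything official):

```python
def discounted_sum(counts_by_year, current_year, horizon=10):
    """Sum yearly counts, weighting year (current_year - n) by 1/(n+1),
    so recent years matter more than older ones."""
    return sum(counts_by_year.get(current_year - n, 0) / (n + 1)
               for n in range(horizon + 1))

def flo_bi(hosted, mosaic, chapters, net_chapter_changes, eec_member_years, year):
    """Flo's BI sketch: discounted hosting and Mosaic sums plus the
    chapter and EEC terms from the formula above."""
    return (discounted_sum(hosted, year)
            + chapters
            + net_chapter_changes          # opened +1 / closed -1 over 10 years
            + eec_member_years / 10
            + discounted_sum(mosaic, year))
```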

Damn, it stripped the MathML tags :-(

Flo, could you send me a screen shot of your suggestion, and we'll post that here?

Hey,
nice chart!
I'm a bit confused - what do you mean, it doesn't include interchange? To me it looks like it does, at least when I look at the Finnish figures.
I guess it could look different if you removed IC; at least in Finland it's very unpopular at the moment - most chapters don't even try to host them.

I like Flo's idea - it looks good, but I'd also add some sort of weights to the BI. Those obviously depend on your priorities, but here is my suggestion:
BI = 0,4*Hosted International Camps + 0,25*Chapters + 0,15*EEC-Members + 0,2*Mosaic Projects
Meaning that not all parts of the BI are equally important for being a big/sustainable NA... just an idea.
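In code, this weighted variant might look like the following (a sketch with decimal points instead of commas; the names are mine):

```python
# Suggested weights for the four BI indicators (they sum to 1.0):
WEIGHTS = {"hosted": 0.40, "chapters": 0.25, "eec": 0.15, "mosaic": 0.20}

def weighted_bi(hosted, chapters, eec, mosaic, weights=WEIGHTS):
    """Weighted Balcony Index: each indicator scaled by its priority."""
    parts = {"hosted": hosted, "chapters": chapters, "eec": eec, "mosaic": mosaic}
    return sum(weights[k] * parts[k] for k in parts)
```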

Combining all those comments into a new formula to be seen here!

Thanks for all the comments so far:
- Kaisa, I think you are right, interchanges ARE included. And they should be, if you consider all the family and community involvement.
- Tuca (at Facebook) says that national activities should be included. Agreed! But I think national activities should try to get "Mosaic status" for quality purposes. Then they would be counted.
- Paul: Weighting different indicators is a good idea, but I wanted something simple to start off with.
- Ze: Fantastic comment with lots of interesting numbers. Again, I'm afraid we need to strike a balance between a reliable index and an easy-to-calculate tool. The question of what defines a chapter is something that has been bothering me for ages, but there's no solution on the horizon, I'm afraid.
- Flo: It seems as if your formula gives an average of the past 10 years. But I'd be interested in how the Balcony Index changes over the years: Has Austria climbed up the scale since 1999? Probably yes. I'm working on some more graphs to display exactly that, hopefully soon.

More thoughts:

I'm not a statistician, and I have no experience in designing indexes. I just threw some thoughts together in order to see what I get.

To do this the right way, I guess we would have to define exactly what we are trying to measure: A sustainability index? A strength index? Should the index be able to predict how an NA will be doing in 5 years? Once a better definition exists, you could look back into the past and test different indicators in their ability to make these kinds of predictions. You would then also be able to weight indicators by their predictive power.

We would also have to be specific about what we want from the index: Detect NAs that need support before they crash? Find best practices in NAs that are successful (i.e. have a high BI)?

I'm also uncertain about the interdependence of the indicators. It all interacts: Number of chapters, number of hosted programmes, etc. Maybe one indicator alone could be good enough to explain it all.

This is the point where my understanding reaches its limits: Does it make sense to have an index that predicts whether the index itself will still be high in the future? To say it differently: If you include the number of hosted programmes in the index, does it make sense to use the index to predict whether an NA will still host programmes in the future? Very difficult. If anybody has more background on this, I'd welcome some input.

Just a quick thought.

I think the core is to see how many programmes we host. So I would suggest looking at how many programmes an NA has hosted over the past 5 years, and how steadily. The steadier the programme hosting, the more sustainable the NA. Would that make sense?

The strongest NAs would then be the ones hosting the most programmes in the steadiest way.

I think that all programmes should be included in this - also JB.

Flo: You need to subtract for closed chapters.

Cancelled programmes should also count negatively... given that the BI is going to be high for those hosting a lot, I suggest that each cancelled programme be subtracted with the same weight as a hosted camp.

Meaning that if you planned to host two but cancelled one, you get 0. :)

Stability would also be nice... some variance calculation based on hosting per year. Need to think about that. Variance is not necessarily bad if it means stable growth...
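The stability idea above (penalizing variance in yearly hosting) could be sketched like this; the 0.5 penalty weight is an arbitrary assumption of mine, and as noted, this simple version doesn't distinguish bumpiness from steady growth:

```python
from statistics import mean, pvariance

def steadiness_score(hosted_per_year, penalty=0.5):
    """Mean yearly hosting minus a variance penalty: a flat hosting
    record scores higher than a bumpy one with the same mean."""
    return mean(hosted_per_year) - penalty * pvariance(hosted_per_year)

# Same 5-year total, different steadiness:
print(steadiness_score([3, 3, 3, 3, 3]))  # -> 3.0
print(steadiness_score([5, 1, 5, 1, 3]))  # -> 1.4
```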

I agree with what Tore says, but I wonder what would "count" as a JB programme within the Balcony Index. From my own experience in the USA, we host 4-5 "national" activities each year and have begun hosting "regional" (within the USA) activities (there were four last year). Does everything we host count for something in the equation? What about single-chapter NAs? Could their activities skew the equation?

@Lars: Cancelling sounds like a good "negative indicator" of an NA with a lot of problems - one that also creates problems for others, thinking especially of interchanges.

@Martin/Tore: I like the idea of including JB in the index: A strong JB is a good indicator of future leadership. But how? In the present index, only the nationality of the IJRs counts. It can also be assumed that a bunch of Mosaic projects are run by juniors. But this doesn't seem enough. Counting members of the IJB team would indicate strong leadership. Counting the number of JB activities would be great, but is difficult to measure. However, this could change: I think the IJB team should try to quantify the number of JB camps held around the world: How many people participate? Maybe they could even come up with IJB categories (A, B, C... similar to the NA categories). I think the BI reveals here that, while it is already difficult to measure the strength of an NA, it is even more difficult to rate a JB.

One more thing:
I was considering adding something like the number of Google search results as an indicator of "hype". Google Trends (as mentioned in an earlier post) gives some ideas: 10 countries are listed there - they could be awarded extra points in the BI.

Are you all familiar with the butterfly effect theory? The phenomenon whereby a minute localized change in a complex system can have large effects elsewhere (i.e. a butterfly moving its wings in Chicago can change the weather in Budapest).

My point is that we cannot come up with a BI without taking into consideration every single aspect of CISV and how it works. And it is a very complex, very "butterfly effect" kind of system (for example, there is a limited number of camps to be hosted each year in each region, and an NA might want to host a certain type of camp, or more than one type, but can't because of the limited number of places ==> here we start going into what an NA wanted to do and couldn't do, and how ready it was or wasn't for what it wanted to do).

So the factors to be included in the BI are limitless, and we cannot cater for every factor (and going through the internet "hype" door takes us to another set of factors to be accounted for, such as websites, hits on JBpedia, the number of CISV blogs by members of that NA, and many other things we could include).

The butterfly effect is a very complicated theory to apply to a CISV BI, and we shouldn't go too far into the calculation of the BI by including so many factors in it. Maybe what we can have is a set of sub-indexes - an Internet Index (II), a Hosting Index (HI), a JB Index (JBI), etc. - that all add up to form one BI at the end.

The ways we can tackle this are endless, and assumptions and limits should be drawn out, or else we would keep complicating the BI more and more.

And one last point: the BI in its original form yields a small number of NAs with an actual BI number and a majority of NAs with a BI of 0 or 1 - so it only shows us the very strongest NAs and does not compare the larger rest of the NAs to see how well they are performing. Unless all the other NAs really are performing at zero, in which case it's acceptable that they turn up with a BI of 0 - but then it's a problem to have all these NAs at 0 ==> it shows how dysfunctional we are as CISV in some countries (or a big chunk of them).

JB in BI: I know how difficult it can be to measure JB, its activities and of course also the impact. So putting it into a BI would definitely not make it any easier. However, I know that during the past years the IJRs have collected such information in the NRF - like the number of JBers, the number of activities, etc. - and I also know that the IJB Team is currently working on improving this system. So if you have any thoughts, please mail them to the IJB Team or the IJRs. Anyway, I think taking "straight" numbers is the only way to make such a static BI, which could be fun.

"Butterfly effect": I kind of like the idea of having a sub-index for each thing in the BI, but I think this is kind of what is happening already. Most of all, I think it is important to point out that this is basically just Nick being a super "statistics nerd" (sorry for the expression Nick, I don't know you) and thinking about how CISV works. And this is what I like so much about the "Balcony": that we can all throw in ideas and collaborate on them, without too much "board talk". It is okay to look at things from new angles and try out stuff that we couldn't do through official channels. Thanks a lot, Nick. So, I guess my point is that this BI shouldn't get too complicated.

On the butterfly effect: What we're talking about is nothing else but "chaos theory". I've never managed to comprehend the theory fully, despite reading a whole book on it, but what I understand is this: Some systems are so complex that it is impossible to set up rules to explain them - ever.

I agree that, next to the weather and economies, CISV is surely also a chaotic system. But just as with weather and economic forecasts, people try to gather indicators of what will happen in the future - with a certain level of uncertainty.

So, I never expected the BI to "explain" and predict the development of CISV. Right now, my humble goal was merely to create an index to measure the success of our NAs. With a little tweaking, my hope is that the development of the BI could also give pointers to which NAs are being very successful (rising BI - what did they do right?) or unsuccessful (falling BI - what went wrong?).

I wasn't trying to suggest that the BI become an official indicator of NA performance in CISV, and I know it is a "statistical-nerdy, CISV-devil-like" idea of Nick's to explain who is doing well and who isn't.

I was just suggesting that we not add more stuff and stick to what we have as the BI equation for now, because otherwise we will fall into the endless list of what to add.

Anyways, Nick, do you think you can manage to calculate the BI of all countries from your original BI and the latest one Paul suggested? It'd be very interesting to see how every NA is doing BI-wise.

@Hani - I absolutely agree that it's just an idea at the moment and shouldn't get too complicated. I'll be posting some more numbers on the BI as soon as I have the stats sorted out.

Moni/GUA posted some more suggestions on what to add over at Facebook:

General Population
Socio-economic situation of countries (GDP, illiteracy, purchasing power, etc.)
Governmental or nongovernmental Help
Financial soundness of the NAs

All of these, I think, would be very good indicators of the sustainability of an NA. Right now the BI is more about "performance".

Well, it gives a weighted average over the years - with the last year being the most important, and earlier years counting less in the total. But yes, it's a kind of "average".

That's what I did - it says "opened or closed". I assumed that a closed chapter would be -1, and an opened one +1... sorry for being unclear ;-)

Sorry Paul, but your last proposal makes JB account for 60% of the BI.
I know how much we think JB rules the CISV world, but isn't that a bit too much?

Another thing: if we're looking at steadiness we should include past years' camps, whereas if we're looking at influence we should consider Zé's numbers, which actually mean Norway and Costa Rica have way more influence than Italy or the USA.

And finally, should we consider the number of inhabitants of the entire country? I really think countries like Denmark, Israel or Costa Rica are stronger within their own countries than we are in France, for instance: we have more manpower but are far more diluted.

Oh and btw, happy new year everyone :-)

PS: I've always considered chaos theory a good excuse for not trying to model something ;-)

What about measuring the strongest NA in an international NA soccer cup? It would probably be less chaotic.

I think the "hype" factor (the one that includes Google searches) is quite dangerous, because every country has a different relationship with the internet and media behavior.

I like Clemzi's thoughts about the influence level of each NA in its respective nation. What I see is the butterfly effect happening to an idea that started simple and is getting more complex.

My shy suggestion is: try to separate influence level from the real size of NAs; then we would have two different indexes. One index is the original BI, related only to the size (number of camps hosted and people) of each NA. The second index would take care of influence level, brand awareness, media attention and publicity. Then we could compare the two indexes and see whether communication efforts can really make an NA lift off.

But don't ask me what to do with Mosaic and JB, still didn't figure it out.

PS - two indexes might be even more chaotic than a single tough one. Damn butterfly!

peace

Although at first I agree with Clemzi's concept of "diluted manpower", we have to keep in mind what we want from the BI. Is it going to measure how strong CISV is in a country, or how strong that country is in CISV? In other words: do we want to know if CISV has the power to change a country, or do we want to know something like: "if this NA simply disappeared, how much would we miss it?"

For some quick comparison, I simply divided the number of international activities hosted by the population, for all NAs that hosted an international activity in 2008. This gives interesting numbers (the results below were multiplied by 10.000.000):

Country: Int. Activities / Population (× 10.000.000)
India 0,01
Indonesia 0,13
Korea 0,20
Philippines 0,22
Japan 0,31
Australia 0,45
Mexico 0,46
Great Britain 0,48
Argentina 0,50
Egypt 0,51
Poland 0,52
United States 1,04
Brazil 1,04
Bulgaria 1,32
Colombia 1,32
Ecuador 1,42
Guatemala 1,43
Spain 1,52
Belgium 1,85
Hungary 2,00
Germany 2,08
Costa Rica 2,18
France 2,29
Netherlands 2,41
Switzerland 2,57
Israel 2,66
Canada 3,23
Czech Republic 3,86
Italy 4,65
Portugal 5,64
Austria 10,75
Finland 11,20
Denmark 19,87
Sweden 26,76
Norway 30,87
Iceland 31,49

Interesting to see USA and Brazil right in the middle, close to Bulgaria. France and Costa Rica appear very close as well, although France hosted 15 activities and Costa Rica only one. Scandinavians are on top - no surprise. And our strongest NA would be Iceland - with only one international activity hosted.
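The per-capita figure above is just hosted activities divided by population, scaled to 10 million inhabitants; a minimal sketch (the example numbers are illustrative, not actual CISV data):

```python
def per_capita_index(hosted, population, scale=10_000_000):
    """International activities hosted per 10 million inhabitants."""
    return hosted * scale / population

# Illustrative: 1 activity in a country of 10 million inhabitants
print(per_capita_index(1, 10_000_000))  # -> 1.0
```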

Hmm, it double posted and did not accept the HTML tags to create a nice table. Nick, can you please delete one of the two posts (and this one as well)?

Hey Martin, thanks for the interesting numbers. They indicate the "CISV output per capita", measured in hosted camps. You could also calculate CISVers/population, which would be something like "CISV penetration", but I'm afraid we don't know how many "CISVers" exist in a given country.
Anyway, the fact that Scandinavia is on top doesn't come as a surprise, but it is an interesting piece of information.

I also like your question concerning what information the BI is supposed to give. I really only tried to measure the "performance" of a CISV NA, independent of the situation in that country. But the next step, which would be way more interesting, would be to find indicators that are able to predict how the BI is going to develop. Here I see a place for country size, GDP per capita, CISV penetration, etc...

Martin, I only deleted one of your posts, because I like the list of countries under each other.