Portfolio construction through handcrafting: implementation
This post is all about handcrafting; a method for doing portfolio construction which human beings can do without computing power, or at least with just a spreadsheet. The method aims to achieve the following goals:
- Humans can trust it: intuitive and transparent method which produces robust weights
- Can be easily implemented by a human in a spreadsheet
- Can be back tested
- Grounded in solid theoretical foundations
- Takes account of uncertainty in data estimates
- Decent out of sample performance
- Addresses the problem of allocating capital to assets on a long only basis, or to trading strategies. It won't be suitable for a long/short portfolio.
This is the third in a series of posts on the handcrafting method.
- The first post can be found here, and it motivates the need for a method like this.
- In the second post I build up the various components of the method, and discuss why they are needed.
- In this, the third post, I'll explain how you'd actually apply the method step by step, with code.
- Post four will test the method with real data
This will be a 'dual track' post, in which I'll describe two implementations:
- a spreadsheet based method suitable for small numbers of assets, where you need to do a one-off portfolio for live trading rather than a repeated backtest. It's also great for understanding the intuition of the method - a big plus point of this technique.
- a python code based method. This uses (almost) exactly the same method, but can be backtested (the difference is that the grouping of assets is done manually in the spreadsheet based method, but automatically here based on the correlation matrix). The code can be found here; although this will live within the pysystemtrade ecosystem I've deliberately tried to make it as self contained as possible, so you could easily drop it out into your own framework.
The demonstration
To demonstrate the implementation I'm going to need some data. This won't be the full blown actual data that I'll be using to test the technique properly, but we do need *something*. It needs to be an interesting data set, with the following characteristics:
- different levels of volatility (so not a bunch of trading systems)
- hierarchy of 3 levels (more would be too complex for the human implementation, less wouldn't be a stern enough test)
- not too many assets such that the human implementation is too complex
I'm going to use long only weekly returns from the following instruments: BOBL, BUND, CORN, CRUDE_W, EURODOLLAR, GAS_US, KR10, KR3, US10, US20; from 2014 to the present (since for some of these instruments I only have data for the last 5 years).
Because this is not a proper test I won't be doing any fancy rolling out of sample optimisation, just a single portfolio.
The descriptive statistics can be found here. The python code which gets the data (using pysystemtrade) is here.
(I've written the handcrafting functions to be standalone; when I come to testing them with real data I'll show you how to hook them into pysystemtrade.)
Overview of the method
Here are the stages involved in the handcrafting method. Note that some of them are optional:
- (Optional if using a risk target, and automated): partition the assets into high and low volatility
- Group the assets hierarchically (if step 1 is followed, this will form the top level grouping). This will be done either by (i) an automated clustering algorithm or (ii) human common sense.
- Calculate volatility weights within each group at the lowest level, proceeding upwards. These weights will either be equal, or use the candidate matching technique described in the previous post.
- (Optionally) Calculate Sharpe Ratio adjustments. Apply these to the weights from step 3.
- Calculate diversification multipliers for each group. Apply these to the weights from step 4.
- Calculate cash weights using the volatility of each asset.
- (Optionally) if a risk target was used with a manual method, partition the top level groups into high and low volatility.
- (Optionally) if a risk target was supplied; use the technique outlined in my previous post to ensure the target is hit.
Spreadsheet: Group the assets hierarchically
A suggested grouping is here. Hopefully it's pretty self explanatory. There might be some debate about whether Eurodollar and bonds should be glued together, but part of doing it this way was to see if the diversification multiplier fixes this potential mistake.
Spreadsheet: Calculate volatility weights
The calculations are shown here.
Notice that for most groups there are only one or two assets, so things are relatively trivial. Then at the top level (level 1) we have three assets, so things are a bit more fun. I use a simple average of correlations to construct a correlation matrix for the top level groups. Then I use a weighted average of two candidate matrices to work out the required weights for the top level groups.
The weights come out as follows:
- Developed market bonds, which we have a lot of, 3.6% each for a total of 14.4%
- Emerging market bonds (just Korea), with 7.2% each for a total of 14.4%
- Energies get 10.7% each, for a total of 21.4%
- Corn gets 21.4%
- Eurodollar gets 28.6%
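The 'simple average of correlations' step is easy to reproduce outside the spreadsheet. Here's a minimal sketch; the correlation values and group names are made up for illustration, not taken from the spreadsheet:

```python
import numpy as np

# Build a group level correlation matrix entry by taking the simple average
# of the pairwise correlations between the members of each pair of groups.
# All numbers and group names here are illustrative only.

corr = np.array([
    [1.0, 0.9, 0.2, 0.1],
    [0.9, 1.0, 0.3, 0.2],
    [0.2, 0.3, 1.0, 0.4],
    [0.1, 0.2, 0.4, 1.0],
])
groups = {"bonds": [0, 1], "commodities": [2, 3]}

def group_correlation(corr, members_a, members_b):
    # simple average of the cross correlations between the two groups
    return np.mean([corr[i, j] for i in members_a for j in members_b])

print(group_correlation(corr, groups["bonds"], groups["commodities"]))
# the average of 0.2, 0.1, 0.3 and 0.2
```

Doing this for every pair of top level groups gives the small correlation matrix used for the candidate matching step.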
Spreadsheet: Calculate Sharpe Ratio adjustments (optional)
Adjustments for Sharpe Ratios are shown in this spreadsheet. You should follow the calculations down the page, as they are done in a bottom up fashion. I haven't bothered with interpolating the heuristic adjustments; instead I've just used VLOOKUP to match the nearest adjustment row.
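The nearest-match lookup the spreadsheet does with VLOOKUP can be sketched in a few lines of Python. The adjustment table below is an invented placeholder, not the actual heuristic table from the previous post:

```python
# Match a relative Sharpe Ratio to the nearest row of a heuristic adjustment
# table, as the spreadsheet does with VLOOKUP (no interpolation). The table
# values here are invented placeholders, not the real heuristic table.

ADJUSTMENT_TABLE = {
    -0.4: 0.85,
    -0.2: 0.93,
     0.0: 1.00,
     0.2: 1.07,
     0.4: 1.15,
}

def nearest_adjustment(relative_SR):
    # pick the table row whose relative SR is closest to the one we have
    nearest_row = min(ADJUSTMENT_TABLE, key=lambda row: abs(row - relative_SR))
    return ADJUSTMENT_TABLE[nearest_row]

print(nearest_adjustment(0.13))  # nearest row is 0.2, so the multiplier is 1.07
```

The resulting multiplier is applied to an asset's volatility weight, after which the weights in the group are renormalised.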
Spreadsheet: Calculate diversification multipliers (DM)
DM calculations are shown in this sheet. DMs are quite low in bonds (where the assets in each country are highly correlated), but much higher in commodities. The final set of adjustments is particularly striking; note the reallocation from the single instrument rates group (initial weight 30.7%, falls to 24.2%) to commodities (initial weight 29%, rises to 36.5%).
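As a reminder (assuming the formula from the previous post), the diversification multiplier for a group is 1/sqrt(w'Hw), where w are the volatility weights within the group and H is its correlation matrix. A minimal sketch with illustrative numbers:

```python
import numpy as np

# Diversification multiplier for a group: DM = 1 / sqrt(w'Hw), where w are
# the volatility weights within the group and H is the correlation matrix.

def div_multiplier(weights, corr):
    weights = np.asarray(weights)
    return 1.0 / np.sqrt(weights @ corr @ weights)

# Two equally weighted assets with correlation 0.5 (illustrative numbers):
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
print(div_multiplier([0.5, 0.5], corr))  # 1/sqrt(0.75), about 1.1547
```

The more correlated the assets in a group, the closer w'Hw gets to 1 and the smaller the multiplier; which is exactly why the bond groups get low DMs and the commodities get high ones.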
Spreadsheet: Calculate cash weights
(Almost) finally we calculate our cash weights, in this spreadsheet. Notice the large weight to low volatility Eurodollar.
Spreadsheet: Partition into high and low volatility
(optional: if risk target used with manual method)
If we're using a risk target we will need to partition our top level groups (this is done automatically with python, but spreadsheet people are allowed to pick their own groupings). Let's choose an arbitrary risk target: 10%. This should be achievable, since the average risk of our assets is 10.6%.
This is the average volatility of each group (calculated here):
Bonds: 1.83%
Commodities: 14.6%
Rates: 0.89%
So we've got:
High vol: commodities
Low vol: Rates and bonds
(Not a big surprise!)
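The partition itself is trivial: compare each group's average volatility (from above) to the risk target.

```python
# Partition the top level groups into high and low volatility, relative to
# the risk target, using the group volatilities calculated above.

RISK_TARGET = 0.10

group_vols = {"Bonds": 0.0183, "Commodities": 0.146, "Rates": 0.0089}

high_vol = [group for group, vol in group_vols.items() if vol > RISK_TARGET]
low_vol = [group for group, vol in group_vols.items() if vol <= RISK_TARGET]

print(high_vol)  # ['Commodities']
print(low_vol)   # ['Bonds', 'Rates']
```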
The natural risk of the portfolio comes out at 1.09% (calculated here). Let's explore the possible scenarios:
- Risk target lower than 1.09%, eg 1%: We'd need to add cash to the portfolio. Using the spreadsheet with a 1% risk target you'd need to put 8.45% of your portfolio into cash; with the rest going into the constructed portfolio.
- Risk target higher than 1.09% with leverage allowed: You'd need to apply a leverage factor; with a risk target of 10% you'd need a leverage factor of 9.16
- Risk target higher than 1.09% without leverage: You'd need to constrain the proportion of the portfolio that allocated to low risk assets (bonds and rates). The spreadsheet shows that this comes out at 31.4% cash weight, with the rest in commodities. I've also recalculated the weights with this constraint to show how it comes out.
And here are those final weights (to hit 10% risk with no leverage):
weight
BOBL 2.17%
BUND 0.78%
US10 0.44%
US20 0.23%
KR3 7.25%
KR10 1.86%
EDOLLAR 18.67%
CORN 36.67%
CRUDE_W 19.47%
GAS_US 12.45%
Python code
The handcrafting code is here. Although this file will ultimately be dumped into pysystemtrade, it's designed to be completely self contained so you can use it in your own applications.
The code expects weekly returns, and for all assets to be present. It doesn't do rolling optimisation, or averaging over multiple assets. I still need to write code to hook it into pysystemtrade, and to achieve these various goals.
The only input required is a pandas DataFrame, returns, with named columns containing weekly returns. The main object you'll be interacting with is called Portfolio.
Simplest use case, to go from returns to cash weights without risk targeting:
p=Portfolio(returns)
p.cash_weights
I won't document the API or methodology fully here, but hopefully you'll get the idea.
Python: Partition the assets into high and low volatility
Let's try with a risk target of 10%:
p=Portfolio(returns, risk_target=.1)
p.sub_portfolios
Out[575]: [Portfolio with 7 instruments, Portfolio with 3 instruments]
p.sub_portfolios[0]
Out[576]: Portfolio with 7 instruments
p.sub_portfolios[0].instruments
Out[577]: ['BOBL', 'BUND', 'EDOLLAR', 'KR10', 'KR3', 'US10', 'US20']
p.sub_portfolios[1].instruments
Out[578]: ['CORN', 'CRUDE_W', 'GAS_US']
So all the bonds get put into one group, and the other assets into another. Seems plausible.
Using a very high risk target is a bad idea:
p=Portfolio(returns, risk_target=.3)
p.sub_portfolios
Not many instruments have risk higher than target; portfolio will be concentrated to hit risk target
Out[584]: [Portfolio with 9 instruments, Portfolio with 1 instruments]
This is an even worse idea:
p=Portfolio(returns, risk_target=.4)
p.sub_portfolios
Exception: Risk target greater than vol of any instrument: will be impossible to hit risk target
The forced partitioning into two top level groups will not happen if leverage is allowed, or if no risk target is supplied:
p=Portfolio(returns) # no risk target
p.sub_portfolios
Natural top level grouping used
Out[44]: [Portfolio with 7 instruments, Portfolio with 2 instruments, Portfolio with 1 instruments]
p=Portfolio(returns, risk_target=.3, allow_leverage=True)
p.sub_portfolios
Natural top level grouping used
Out[46]: [Portfolio with 7 instruments, Portfolio with 2 instruments, Portfolio with 1 instruments]
Python: Group the assets hierarchically
Here's an example when we're allowing the grouping to happen naturally:
p=Portfolio(returns)
p.show_subportfolio_tree()
Natural top level grouping used
Out[48]: [' Contains 3 sub portfolios', ['... Contains 3 sub portfolios', ["...... Contains ['KR10', 'KR3']"], ["...... Contains ['EDOLLAR', 'US10', 'US20']"], ["...... Contains ['BOBL', 'BUND']"]], ["... Contains ['CRUDE_W', 'GAS_US']"], ["... Contains ['CORN']"]]
We have three top level groups: interest rates, energies, and Ags. The interest rate group is further divided into second level groupings by country: Korea, US and Germany. Here's an example when we're doing a partition by risk:
p=Portfolio(returns, risk_target=.1)
p.Show_subportfolio_tree()
Applying partition to hit risk target
Partitioning into groups to hit risk target of 0.100000
Out[42]: [' Contains 2 sub portfolios', ['... Contains 3 sub portfolios', ["...... Contains ['KR10', 'KR3']"], ["...... Contains ['EDOLLAR', 'US10', 'US20']"], ["...... Contains ['BOBL', 'BUND']"]], ["... Contains ['CORN', 'CRUDE_W', 'GAS_US']"]]
There are now two top level groups, as we saw above.
If you're a machine learning enthusiast who wants to play around with the clustering algorithm, then the heavy lifting of the clustering algo is all done in this method of the Portfolio object:
def _cluster_breakdown(self):
    X = self.corr_matrix.values
    d = sch.distance.pdist(X)
    L = sch.linkage(d, method='complete')
    # play with this line at your peril!!!
    ind = sch.fcluster(L, MAX_CLUSTER_SIZE, criterion='maxclust')
    return list(ind)
However I've found the results to be very similar regardless of the method used.
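If you want to experiment with the clustering outside the Portfolio object, here's a self-contained version of the same steps; the correlation matrix is made up (two correlated 'bond-like' assets and two correlated 'energy-like' assets):

```python
import numpy as np
import scipy.cluster.hierarchy as sch
from scipy.spatial.distance import pdist  # sch.distance.pdist in the code above

MAX_CLUSTER_SIZE = 2

# Made-up correlation matrix: assets 0 and 1 are highly correlated (think two
# bonds), as are assets 2 and 3 (think two energies); cross correlations low.
corr_matrix = np.array([
    [1.0, 0.9, 0.1, 0.0],
    [0.9, 1.0, 0.0, 0.1],
    [0.1, 0.0, 1.0, 0.7],
    [0.0, 0.1, 0.7, 1.0],
])

# Same steps as _cluster_breakdown: treat each row of the correlation matrix
# as a point, build a complete linkage tree, then cut into flat clusters
d = pdist(corr_matrix)
L = sch.linkage(d, method='complete')
ind = sch.fcluster(L, MAX_CLUSTER_SIZE, criterion='maxclust')
print(list(ind))  # the two 'bonds' share one label, the two 'energies' the other
```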
Python: Calculate volatility weights
p=Portfolio(returns, use_SR_estimates=False) # turn off SR estimates for now
p.show_subportfolio_tree()
Natural top level grouping used
Out[52]: [' Contains 3 sub portfolios', ['... Contains 3 sub portfolios', ["...... Contains ['KR10', 'KR3']"], ["...... Contains ['EDOLLAR', 'US10', 'US20']"], ["...... Contains ['BOBL', 'BUND']"]], ["... Contains ['CRUDE_W', 'GAS_US']"], ["... Contains ['CORN']"]]
Let's look at a few parts of the portfolio. Firstly the very simple single asset Corn portfolio:
# Just Corn, single asset
p.sub_portfolios[2].volatility_weights
Out[54]: [1.0]
# Just two assets, so goes to equal vol weights
p.sub_portfolios[1].volatility_weights
Out[55]: [0.5, 0.5]
# The US bond group is the only interesting one
# Pretty close to equal weighting
p.sub_portfolios[0].sub_portfolios[1].volatility_weights
Out[57]: [0.28812193544790643, 0.36572016685796049, 0.34615789769413313]
The Energy portfolio is slightly more interesting with two assets; however this will default to equal volatility weights. Only the US bonds (and STIR) portfolio has three assets, and so will use the candidate matching algorithm:
p.sub_portfolios[0].sub_portfolios[1].corr_matrix
Out[58]:
          EDOLLAR      US10      US20
EDOLLAR  1.000000  0.974097  0.872359
US10     0.974097  1.000000  0.924023
US20     0.872359  0.924023  1.000000
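For readers who want a feel for candidate matching without digging out the previous post, here is a heavily simplified sketch of the idea: score each candidate correlation matrix by its closeness to the estimated one, and take a similarity-weighted average of the candidates' precomputed weights. The candidate set, their weights, and the similarity scheme below are all invented for illustration; the real method uses the specific candidate matrices and weighting described in the previous post:

```python
import numpy as np

# Heavily simplified sketch of candidate matching. The candidates, their
# precomputed weights, and the similarity scheme are invented placeholders.

def off_diag(corr):
    # the three off-diagonal correlations (AB, AC, BC) of a 3x3 matrix
    return np.array([corr[0, 1], corr[0, 2], corr[1, 2]])

# (candidate correlations (AB, AC, BC), precomputed vol weights) - illustrative
candidates = [
    (np.array([0.9, 0.9, 0.9]), np.array([1 / 3, 1 / 3, 1 / 3])),
    (np.array([0.9, 0.5, 0.5]), np.array([0.3, 0.3, 0.4])),
    (np.array([0.5, 0.9, 0.5]), np.array([0.3, 0.4, 0.3])),
]

def candidate_matched_weights(corr, candidates):
    # closer candidates get more say in the blended weights
    distances = np.array([np.linalg.norm(off_diag(corr) - cand_corr)
                          for cand_corr, _ in candidates])
    similarity = 1.0 / (distances + 1e-6)
    similarity = similarity / similarity.sum()
    blended = np.zeros(3)
    for sim, (_, cand_weights) in zip(similarity, candidates):
        blended += sim * cand_weights
    return blended / blended.sum()

est_corr = np.array([[1.0, 0.97, 0.87],
                     [0.97, 1.0, 0.92],
                     [0.87, 0.92, 1.0]])
print(candidate_matched_weights(est_corr, candidates))  # close to equal weights
```

Because the estimated correlations are all high, the 'everything highly correlated' candidate dominates and the blended weights come out close to equal; consistent with the near-equal weights Out[57] shows for the US bond group.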
Python: Calculate Sharpe Ratio adjustments (optional)
p=Portfolio(returns) # by default Sharpe Ratio adjustments are on unless we turn them off
Let's look at a simple two asset portfolio to see how these work:
p.sub_portfolios[1].sharpe_ratio
Out[63]: array([-0.55334564, -0.8375069 ])
Python: Calculate diversification multipliers
p=Portfolio(returns)
Python: Aggregate up sub-portfolios
The portfolio in the python code is built up in a bottom up fashion. Let's see how this happens, by focusing on the 10 year US bond.
p=Portfolio(returns)
Natural top level grouping used
First the code calculates the vol weights for US bonds and rates, including a SR adjustment:
p.sub_portfolios[0].sub_portfolios[1].diags
Out[203]:
                      EDOLLAR      US10      US20
Raw vol (no SR adj)  0.288122  0.365720  0.346158
Vol (with SR adj)    0.292898  0.361774  0.345328
Sharpe Ratio         0.218935  0.164957  0.185952
Portfolio containing ['EDOLLAR', 'US10', 'US20'] instruments
This portfolio then joins the wider bond portfolio (here in column '1'; there are no meaningful names for components of the wider portfolio, as the code doesn't know this is US bonds):
p.sub_portfolios[0].diags.aggregate
Out[206]:
                                  0         1         2
Raw vol (no SR adj or DM)  0.392114  0.261486  0.346399
Vol (with SR adj no DM)    0.423425  0.162705  0.413870
SR                         0.985267  0.192553  1.185336
Div mult                   1.038917  1.026137  1.022638
Portfolio containing 3 sub portfolios aggregate
The Sharpe Ratios, raw vols, and vol weights shown here are for the groups that we're aggregating together. So the raw vol weight on US bonds is 0.26. To see why, look at the correlation matrix:
You can see that US bonds are more highly correlated with asset 0 and asset 2 than those assets are with each other. So US bonds get a lower raw weight. They also have a much worse Sharpe Ratio, so they get further downweighted relative to the other countries.
We can now work out what the weight of US 10 year bonds is within bonds as a whole:
p.sub_portfolios[0].diags
BOBL BUND EDOLLAR KR10 KR3 US10 \
Vol wt in group 0.519235 0.480765 0.292898 0.477368 0.522632 0.361774
Vol wt. of group 0.413870 0.413870 0.162705 0.423425 0.423425 0.162705
Div mult of group 1.022638 1.022638 1.026137 1.038917 1.038917 1.026137
Vol wt. 0.213339 0.197533 0.047473 0.203860 0.223189 0.058636
US20
Vol wt in group 0.345328
Vol wt. of group 0.162705
Div mult of group 1.026137
Vol wt. 0.055971
Portfolio containing 3 sub portfolios
The first row is the vol weight of the asset within its group; we've already seen this calculated. The next row is the vol weight of the group as a whole; again we've already seen the figures for US bonds calculated above. After this is the diversification multiplier for the US bond group. Finally we can see the volatility weight of US 10 year bonds in the bond group as a whole; equal to the vol weight within the group, multiplied by the vol weight of the group, multiplied by the diversification multiplier of the group; and then renormalised to add up to 1.
Finally we are ready to construct the top level grouping, in which bonds as a whole is asset '0'. First the correlation matrix:
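This last calculation is easy to verify by hand using the numbers in the table above:

```python
# Reproduce the 'Vol wt.' row of the table above: multiply (vol weight within
# group) x (vol weight of group) x (diversification multiplier of group),
# then renormalise so the weights add up to 1. Numbers copied from the table.

wt_in_group = {'BOBL': 0.519235, 'BUND': 0.480765, 'EDOLLAR': 0.292898,
               'KR10': 0.477368, 'KR3': 0.522632, 'US10': 0.361774,
               'US20': 0.345328}
wt_of_group = {'BOBL': 0.413870, 'BUND': 0.413870, 'EDOLLAR': 0.162705,
               'KR10': 0.423425, 'KR3': 0.423425, 'US10': 0.162705,
               'US20': 0.162705}
div_mult = {'BOBL': 1.022638, 'BUND': 1.022638, 'EDOLLAR': 1.026137,
            'KR10': 1.038917, 'KR3': 1.038917, 'US10': 1.026137,
            'US20': 1.026137}

raw = {k: wt_in_group[k] * wt_of_group[k] * div_mult[k] for k in wt_in_group}
total = sum(raw.values())
vol_wt = {k: v / total for k, v in raw.items()}

print(round(vol_wt['US10'], 6))  # about 0.058636, matching the table
```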
notUsedYet = p.volatility_weights
p.aggregate_portfolio.corr_matrix
Out[212]:
0 1 2
0 1.000000 -0.157908 -0.168607
1 -0.157908 1.000000 0.016346
2 -0.168607 0.016346 1.000000
All these assets, bonds [0], energies [1], and corn [2], are pretty uncorrelated, although bonds might just have the edge:
p.diags.aggregate
Out[208]:
0 1 2
Raw vol (no SR adj or DM) 0.377518 0.282948 0.339534
Vol (with SR adj no DM) 0.557443 0.201163 0.241394
SR 1.142585 -0.871979 -0.801852
Div mult 1.252992 1.278761 1.000000
Portfolio containing 3 sub portfolios aggregate
Now to calculate the final weights:
p.diags
Out[241]:
BOBL BUND CORN CRUDE_W EDOLLAR GAS_US \
Vol wt in group 0.213339 0.197533 1.000000 0.539925 0.047473 0.460075
Vol wt. of group 0.557443 0.557443 0.241394 0.201163 0.557443 0.201163
Div mult of group 1.252992 1.252992 1.000000 1.278761 1.252992 1.278761
Vol wt. 0.124476 0.115254 0.201648 0.116022 0.027699 0.098863
KR10 KR3 US10 US20
Vol wt in group 0.203860 0.223189 0.058636 0.055971
Vol wt. of group 0.557443 0.557443 0.557443 0.557443
Div mult of group 1.252992 1.252992 1.252992 1.252992
Vol wt. 0.118945 0.130224 0.034212 0.032657
Portfolio containing 3 sub portfolios
We've now got the final volatility weights. Here's another way of viewing them:
# First remind ourselves of the volatility weights
dict([(instr,wt) for instr,wt in zip(p.instruments, p.volatility_weights)])
Out[80]:
{'BOBL': 0.12447636469041611,
 'BUND': 0.11525384132670763,
 'CORN': 0.20164774158721335,
 'CRUDE_W': 0.11602155610023207,
 'EDOLLAR': 0.027698823230085486,
 'GAS_US': 0.09886319534295436,
 'KR10': 0.11894543449866347,
 'KR3': 0.13022374999090081,
 'US10': 0.034212303586599956,
 'US20': 0.032656989646226771}
The most striking difference to the spreadsheet is that by lumping Eurodollar in with the other US bonds, it gets a much smaller vol weight. German and Korean bonds have gained as a result; the energies and Corn are quite similar.
Python: Calculate cash weights
p=Portfolio(returns)
dict([(instr,wt) for instr,wt in zip(p.instruments, p.Cash_weights)])
Natural top level grouping used
Out[79]:
{'BOBL': 0.21885945926487166,
 'BUND': 0.079116240615862948,
 'CORN': 0.036453365347104472,
 'CRUDE_W': 0.015005426640542012,
 'EDOLLAR': 0.10335586678017628,
 'GAS_US': 0.009421184504702888,
 'KR10': 0.10142345423259323,
 'KR3': 0.39929206844323878,
 'US10': 0.025088747004851766,
 'US20': 0.011984187166055982}
Obviously the less risky assets like 3 year Korean bonds and Eurodollar get a larger cash weight. It's also possible to see how these were calculated from the final volatility weights:
p.diags.cash
Out[199]:
                  BOBL      BUND      CORN   CRUDE_W   EDOLLAR    GAS_US
Vol weights   0.124476  0.115254  0.201648  0.116022  0.027699  0.098863
Std.          0.018965  0.048575  0.184449  0.257816  0.008936  0.349904
Cash weights  0.218859  0.079116  0.036453  0.015005  0.103356  0.009421

                  KR10       KR3      US10      US20
Vol weights   0.118945  0.130224  0.034212  0.032657
Std.          0.039105  0.010875  0.045470  0.090863
Cash weights  0.101423  0.399292  0.025089  0.011984
Portfolio containing 10 instruments (cash calculations)
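Again this is easy to verify by hand: divide each vol weight by the asset's standard deviation, then renormalise.

```python
# Reproduce the cash weight calculation: divide each vol weight by the
# asset's standard deviation, then renormalise. Numbers from the table above.

vol_weights = {'BOBL': 0.124476, 'BUND': 0.115254, 'CORN': 0.201648,
               'CRUDE_W': 0.116022, 'EDOLLAR': 0.027699, 'GAS_US': 0.098863,
               'KR10': 0.118945, 'KR3': 0.130224, 'US10': 0.034212,
               'US20': 0.032657}
stdevs = {'BOBL': 0.018965, 'BUND': 0.048575, 'CORN': 0.184449,
          'CRUDE_W': 0.257816, 'EDOLLAR': 0.008936, 'GAS_US': 0.349904,
          'KR10': 0.039105, 'KR3': 0.010875, 'US10': 0.045470,
          'US20': 0.090863}

raw = {k: vol_weights[k] / stdevs[k] for k in vol_weights}
total = sum(raw.values())
cash_weights = {k: v / total for k, v in raw.items()}

print(round(cash_weights['KR3'], 4))  # about 0.3993, matching the output above
```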
Python: Check risk target is hit, adjust weights if required
The natural risk of the unconstrained portfolio is pretty low: 1.59% (a bit higher than the spreadsheet version, because we haven't allocated as much to Eurodollar).
p=Portfolio(returns)
p.portfolio_std
Natural top level grouping used
Out[82]: 0.015948015324395711
Let's explore the possible scenarios:
- Risk target lower than 1.59%, eg 1%: We'd need to add cash to the portfolio.
p=Portfolio(returns, risk_target=.01)
# if cash weights add up to less than 1, we must be including cash in the portfolio
sum(p.cash_weights)
Calculating weights to hit a risk target of 0.010000
Natural top level grouping used
Too much risk 0.372963 of the portfolio will be cash
Out[84]: 0.62703727056889502
# check risk target hit
p.portfolio_std
Out[85]: 0.01
With a 1% risk target you'd need to put 37.3% of your portfolio into cash, with the rest going into the constructed portfolio.
- Risk target higher than 1.59% with leverage allowed, eg 10%:
p=Portfolio(returns, risk_target=.1, allow_leverage=True)
# If sum of cash weights > 1 we must be using leverage
sum(p.cash_weights)
Calculating weights to hit a risk target of 0.100000
Natural top level grouping used
Not enough risk, leverage factor of 6.270373 applied
Out[87]: 6.2703727056889518
# check target hit
p.portfolio_std
Out[88]: 0.10000000000000001
You'd need to apply a leverage factor; with a risk target of 10% you'd need a leverage factor of 6.27.
- Risk target higher than 1.59% without leverage:
p=Portfolio(returns, risk_target=.1)
Calculating weights to hit a risk target of 0.100000
Not enough risk, no leverage allowed, using partition method
Applying partition to hit risk target
Partitioning into groups to hit risk target of 0.100000
Need to limit low cash group to 0.005336 (vol) 0.323992 (cash) of portfolio to hit risk target of 0.100000
Applying partition to hit risk target
Partitioning into groups to hit risk target of 0.100000
# look at the cash weights
dict([(instr,wt) for instr,wt in zip(p.instruments, p.cash_weights)])
Out[90]:
{'BOBL': 0.07548008030352539,
 'BUND': 0.027285547606928903,
 'CORN': 0.3285778602871447,
 'CRUDE_W': 0.19743348662518673,
 'EDOLLAR': 0.035645291049388697,
 'GAS_US': 0.15010566887898191,
 'KR10': 0.034978842111056153,
 'KR3': 0.13770753839879318,
 'US10': 0.0086525875783564771,
 'US20': 0.0041330971606378854}
# check risk target hit
p.portfolio_std
Out[91]: 0.10001663416516968
In this case the portfolio is constrained by limiting the proportion of the portfolio that is allocated to low risk assets (bonds and rates).
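Both the cash fraction and the leverage factor in the scenarios above come from the same simple scaling: the ratio of the risk target to the natural risk of the portfolio.

```python
# Scaling factor to hit a risk target: the target divided by the natural
# risk of the portfolio. Below 1, (1 - factor) is the fraction held in cash;
# above 1, it is the leverage factor required.

natural_risk = 0.015948015324395711   # p.portfolio_std from above

def scaling_factor(risk_target, natural_risk):
    return risk_target / natural_risk

print(round(scaling_factor(0.01, natural_risk), 6))  # 0.627037 -> 37.3% in cash
print(round(scaling_factor(0.10, natural_risk), 6))  # 6.270373, the leverage factor
```

The no-leverage case can't use this scaling (it would need a factor above 1), which is why the code falls back to the partition method instead.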
What's next
In the next post I'll test the method (in its backtestable python format - otherwise (a) the results could arguably be forward looking, and (b) I have now seen more than enough spreadsheets for 2018 thank you very much) against some alternatives. It could take me a few weeks to post this, as I will be somewhat busy with Christmas, university, and book writing commitments!