
Jim Simons' Renaissance Technologies - is value investing not the only way?


scorpioncapital


I was reading this

 

https://www.vintagevalueinvesting.com/learning-from-jim-simons/

 

Over 30 years (an investing lifetime) he beat Buffett and Soros.

By a big margin.

 

And he did not use value investing.

Just the opposite. He used technical analysis and computer-driven trading.

 

So is there a better return to be had than value investing?

What I want to know is not a justification for why quant doesn't work. Rather: if it works as well as or better than value investing, is there another game in town, and is it easier and/or more lucrative?


Value investing is hardly the only way to make money. That's like walking into a garage and declaring a screwdriver the only way to fix things.

 

Many smart guys just prefer it because it acts as a mental Valium during turbulent times: they take comfort in thinking they know what the business is "really" worth.


My understanding of what Renaissance does is value investing. The only way to make money is to buy low and sell high. If you want to keep the money you make, buy below fair value and sell higher.

 

They have automated what people like Buffett used to do manually: scan a huge number of documents, scan every company, and build the ultimate contrarian investor by programming a machine to do it. What you get out of this process is:

 

  scanning every document/piece of data + a machine-trader  =  a super-Buffett.

 

What I see is hedge funds who cannot build such a program resorting to momentum investing. The "quant" funds such as Renaissance, Two Sigma, and Millennium lack momentum stocks in their 13-Fs. Munger is right that investing has become a lot more competitive.


We can all learn to be a bit better at some timing dynamics.

For example, I bought Antero Midstream recently.  It was a year-end tax-loss-harvesting candidate.  I bought some at $7 and it started to go down on very little fundamental news.  Sure, nat gas prices are low.  The stock started the year at $14, and at $5 it has a ton of tax-loss value.  So understanding that this tax-loss selling wasn't likely to abate until the new year was a key trading decision.  This was literally to avoid "catching a falling knife."  Then they announced a fee reduction in the amount of $50mm, which is way less than what people predicted, and the shares rallied.  I made a gut call that this is a fundamental development.  It doesn't hurt that the shares were bought at $5.25, a 23.4% yield. 

 

After you own a company for 2 years, you tend to have a pulse on the trading ranges.  With HHC, I would buy every time it hits 50% of my estimated NAV.  One of the mistakes that I made with FRPH in late 2018 was not realizing that the new support is really in the low $40s.  Once the shares traded from the low $30s to the mid $60s, the story was out.  Gregmal has mentioned that GRIF now has support in the $40 range.  There are more investors who understand the thesis.  So I probably should not expect the shares to trade to the high $20s and low $30s again like they did in late 2018.  Is this technical?  Or is it really knowing 80-90% of the shareholder base and having a vague understanding of the price that they will be willing to buy at? 

 

Another example is with Berry Global.  It's a good business, but most investors are very quarterly oriented.  If you pay attention to the questions that other investors ask, they are constantly trying to model out the next quarter.  Volumes are down in 2019.  But it is very easy to grow volume when it was down 6% the year before, especially if you believe the declines are temporary in nature.  These are little things that you improve on as an investor over time.  I don't have any advice on long-term compounders that could run away from you.  You probably shouldn't trade those too much.  But for normal assets where there is a price at which you are a buyer and a price at which you are a seller, it's not a bad idea to keep 70-80% and trade around 20-30%, especially in a range-bound market.  If anything, it will help you manage risk: as the shares trade up 20-25%, it is probably correct to reduce the position size.  Again, this is for normal companies, not the GOOG or FB of the world. 

 

Look at the world of MMA.  Years ago, it was all "style vs. style."  Now wrestlers are strikers and strikers can defend takedowns.  If you don't evolve, you get left behind as roadkill.  But the value framework is still the one that I want to stick to, just like a wrestler will rely on his grappling.  Frankly, it does make logical sense to pay up for something that could grow 20% topline with operating leverage.  There is a DCF where that makes sense.  But you need to make sure you are very certain of the ability of these companies to improve their returns and expand margins over time. 


Value investing is hardly the only way to make money. That's like walking into a garage and declaring a screwdriver the only way to fix things.

 

Many smart guys just prefer it because it acts as a mental Valium during turbulent times: they take comfort in thinking they know what the business is "really" worth.

 

I like the screwdriver metaphor.

 

Ed Thorp's autobiography opened my mind up to alternative approaches to investing. So too have people like Ray Dalio. That being said, a value approach resonates with me. It provides a framework for thinking about investment decisions. I think Michael Burry has said that every individual needs to pick an approach that works for their personality type. I couldn't agree more. I'm never going to have the math skills to be a Jim Simons or Ed Thorp. Nor am I going to be a macro trader like Dalio or Druckenmiller. But I do understand market psychology and have the fortitude to go against it if the logic makes sense.


All investing styles experience an arms race as they are discovered.  Technical / computer model based trading is the same.  It is not easy to beat the market with that approach.

 

One thing I can tell you is that the majority of terms/strategies that people talk about with regard to technical trading will not consistently beat the market. I have put the time in and tested so many of these strategies. It is all garbage. Sometimes they work, sometimes they don't.  I don't doubt they used to work, just not anymore; the ideas are too widely known and too easy to implement.  If you want alpha you really have to put your time in, or else look in some really obscure corner and get a bit lucky.

 

The other thing to consider is that with value investing, at least you know what you own; this matters a lot if the market goes south.  If you are using technical trading and you start taking losses, what do you have to fall back on?  If you buy something based on xyz technical factor and it goes down 30%, now what?  If it's a heavily discounted stock with growing earnings, that is one thing; but if you bought for technical reasons and the company just reported an earnings drop, what reason do you have to continue to hold?  Of course you can put in stop losses, but what if those just keep getting triggered and you get bled out?

 

I would just recommend you take the time to get some software and try out various technical trading strategies.  See how consistently they work.  Get data stretching back to 2007.  What kind of results are you getting throughout the cycle?  Try grouping your stock universe into random clumps of 100-200 stocks and see how consistently the strategy performs across the various clumps.  If it's just for you and you are not some shark trying to sell people on making money out of nothing, you will take the time to really analyze what is happening and across various parameters to rule out any kind of bias.  It is your money after all, nobody to fool but yourself.  If you do this (I did!), you will start to see that nothing really works as well as advertised, or at least that was my experience.
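The clump test described above can be sketched in a few lines. Everything here is synthetic and illustrative: the universe, the returns, and the toy momentum rule are all made up, which is exactly the point — you can swap in real data and a real strategy and see whether the per-clump results hold together:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic universe: 1,000 made-up stocks, ~3 years of daily returns.
n_stocks, n_days = 1_000, 750
returns = rng.normal(0.0003, 0.02, size=(n_stocks, n_days))

def strategy_annual_return(rets, lookback=20):
    """Toy momentum rule: each day hold the stocks whose trailing
    `lookback`-day return is positive; report an annualized return."""
    csum = np.cumsum(rets, axis=1)
    trail = csum[:, lookback:] - csum[:, :-lookback]  # trailing returns
    held = trail[:, :-1] > 0                          # signal on day t...
    nxt = rets[:, lookback + 1:]                      # ...earns day t+1's return
    daily = (nxt * held).sum(axis=0) / np.maximum(held.sum(axis=0), 1)
    return daily.mean() * 252

# Split the universe into random clumps of 200 and compare the results.
clumps = rng.permutation(n_stocks).reshape(5, 200)
results = [strategy_annual_return(returns[c]) for c in clumps]
print([f"{r:+.1%}" for r in results])
```

On random data the spread across clumps is pure noise; a strategy whose per-clump results look no tighter than this on real data probably has no edge either.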

 

This is why I chose my name.  It is this type of nonsense.  Right, of course you just buy when the MACD crosses the 30-day moving average, or whatever.  Unless of course the S&P's MACD has an upslope.  Unless of course, in retrospect, that doesn't work.  It's noise. 
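For concreteness, the kind of rule being mocked here — "buy when the MACD crosses above its signal line" — takes only a few lines of pandas. The 12/26/9 spans are the conventional defaults and the prices below are made up:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Made-up daily closing prices (a random walk).
close = pd.Series(100 * np.exp(np.cumsum(rng.normal(0, 0.01, 300))))

ema12 = close.ewm(span=12, adjust=False).mean()
ema26 = close.ewm(span=26, adjust=False).mean()
macd = ema12 - ema26                             # MACD line
signal = macd.ewm(span=9, adjust=False).mean()   # signal line

# A "buy" fires wherever MACD crosses above the signal line.
cross_up = (macd > signal) & (macd.shift(1) <= signal.shift(1))
print(f"{int(cross_up.sum())} buy signals in 300 days")
```

Note that the rule happily generates buy signals on pure noise — which is the poster's complaint in a nutshell.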

 

I still think the best you can do is diligently look for high-quality companies and buy when you are comfortable with the likely outcome.  It is still very competitive, but if you really look hard, you will find little things from time to time, and at least you are buying companies that you like.


Value investing isn't the only way to make money. Technical stuff works too. Especially if you hire the smartest people you can find, take a scientific approach, have great data sets, super-fast market access, an unlimited research budget, excellent risk management, and constantly work on improving your game. I think that is basically what Renaissance is doing. Probably they use financials as well, scan weather reports, track ship movements, whatever. As long as it generates excess returns. I absolutely don't think they do strictly value. In fact I think they don't give a shit what they do - as long as their mathematicians and data scientists think they have found a statistically significant edge. And I absolutely believe you can generate alpha that way.

 

That said, 99.99% of all technical traders you find online are idiots and they lack the data, speed and brains and especially self-awareness to make money. With value investing the percentage of idiots is probably closer to 95%. The good thing is that as long as the stupid value investors buy and hold a few random stocks they generate something that approaches market returns as a group, whereas the stupid technical traders transfer all their savings to their broker and high frequency traders (including Jim Simons, probably).


Especially if you hire the smartest people you can find, take a scientific approach, have great data sets, super fast market access, an unlimited research budget, excellent risk management and constantly work on improving your game. I think that that is basically what Renaissance is doing.

 

Don't forget another factor - using extreme leverage.  They seem to generate single digit annual returns across their invested capital (90%+ of which is borrowed money and only 10% is their capital).  So lots and lots of extreme leverage.
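The arithmetic behind the point above is simple: a single-digit return on the total (mostly borrowed) book becomes a spectacular return on the ~10% equity slice, minus the cost of the borrow. A sketch with made-up figures:

```python
def equity_return(gross_return, leverage, borrow_rate):
    """Return on own capital when `leverage` x equity is deployed.
    E.g. 10x leverage means $1 of equity controls $10 of positions,
    $9 of which is borrowed at `borrow_rate`."""
    borrowed = leverage - 1
    return gross_return * leverage - borrow_rate * borrowed

# Illustrative numbers: 8% on the whole book, 10x levered, 3% cost of borrow.
print(equity_return(0.08, 10, 0.03))  # 0.08*10 - 0.03*9 ≈ 0.53 → ~53% on equity
```

The same formula also shows the downside: a 10% loss on the book at 10x leverage wipes out the equity entirely, which is why the risk-management point raised later in this thread matters so much.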

 

So that's it.  Extreme leverage ... and, uh...basket options. 

 

According to this recent article, Rentech appears to use a scheme that masks short-term trading gains by turning them into long-term capital gains via basket options held by their investment banks.  "Rather than owning securities directly and booking gains and losses from trading activity, RenTech would buy a [bespoke] option from a bank tied to the value of a securities portfolio it held".  Rentech would then direct the bank to buy and sell securities in the portfolio and hold it for a year+.

 

https://www.bloomberg.com/news/articles/2019-11-13/irs-decision-is-bad-omen-for-rentech-tax-dispute-worth-billions

 

wabuffo


Is this technical?  Or is it really knowing 80-90% of the shareholder base and having a vague understanding of the price that they will be willing to buy at. 

 

Look at the world of MMA.  Years ago, it was all "style vs. style."  Now wrestlers are strikers and strikers can defend takedowns.  If you don't evolve, you get left behind as roadkill.  But the value framework is still the one that I want to stick to, just like a wrestler will rely on his grappling.  Frankly, it does make logical sense to pay up for something that could grow 20% topline with operating leverage.  There is a DCF where that makes sense.  But you need to make sure you are very certain of the ability of these companies to improve their returns and expand margins over time.

 

I think the above is a great little tunnel to truly see both sides of the man-vs-machine debate. It is absolutely an advantage playing around with small-cap companies where you know the location and/or names of most of the shareholders. If there are 10M shares outstanding and there's some funky trading activity, with a few phone calls and emails you can likely figure out what's up and start "timing" your next move. However, I would have to imagine that there are some pretty good programmers out there who can use machine learning to size out the shareholder base to a certain degree of confidence as well. You can then calculate the odds of each shareholder's activity and use filings/public appearances/etc. to refine this, over time becoming probably just as efficient (likely way more efficient) as any boots-on-the-ground shareholder could be.

 

However, again, the catch is that often, once patterns emerge and everyone catches on, they stop working. Which is where a good trader/investor again should temporarily have an edge. The machine, one would think, relies on past data and trend/pattern recognition to front-run the movements. Over time, though, the sheer volume of data will undoubtedly allow the machine to win. Human error will probably be the difference maker, and somewhat scarier, if the machine realizes it can manipulate price/volume to influence its results - it will. I can't tell you how many times, just this week, I saw huge flushes of volume and large orders lining up directionally, and then, out of nowhere, a couple-hundred-share trade goes through, all of it disappears, and the trend reverses.

 

Everyone has an opinion on what works and what doesn't. The easiest way to tell? Look at your returns...

Of course there's still a large bunch of disgruntled folks who believe certain types of returns don't count, or that $1 made this way or that way is somehow superior, but money talks. Renaissance has been the best. Period.

 

When friends/family ask me how/what to invest in, I tell them to take a nominal amount of money and go buy whatever they think will be a good investment. More often than not, regardless of what they buy, they come to the same conclusion: this is just as much a mental game as it is a fundamental one. A lot of people don't have any grasp on that, but it's where machine programs like RenTech will always have the edge, because they can quantify emotion via data and then remove the emotion from it. Which even the best of us will never be able to do 100% of the time.


Especially if you hire the smartest people you can find, take a scientific approach, have great data sets, super fast market access, an unlimited research budget, excellent risk management and constantly work on improving your game. I think that that is basically what Renaissance is doing.

 

Don't forget another factor - using extreme leverage.  They seem to generate single digit annual returns across their invested capital (90%+ of which is borrowed money and only 10% is their capital).  So lots and lots of extreme leverage.

 

So that's it.  Extreme leverage ... and, uh...basket options. 

 

According to this recent article, Rentech appears to use a scheme that masks short-term trading gains by turning them into long-term capital gains via basket options held by their investment banks.  "Rather than owning securities directly and booking gains and losses from trading activity, RenTech would buy a [bespoke] option from a bank tied to the value of a securities portfolio it held".  Rentech would then direct the bank to buy and sell securities in the portfolio and hold it for a year+.

 

https://www.bloomberg.com/news/articles/2019-11-13/irs-decision-is-bad-omen-for-rentech-tax-dispute-worth-billions

 

wabuffo

 

So if you trade forex, which is mostly what Renaissance trades AFAIK, leverage of 40x is relatively normal (but much too high for a firm with so much capital), and 10x leverage isn't that high.  The reason is that forex is the deepest market, so even if you trade tens of millions of dollars you are in no danger of being gapped past your stop loss, not to mention that you won't move the market. 


How do they not blow up with extreme leverage? Does someone give them a no-margin-call account, or much higher limits? If so, they are lucky. Heck, with no margin calls one can get rich just waiting to recover.

But if this is the case, they must have huge drawdowns from time to time. I guess leverage and tax havens could have made all of us richer faster. I wonder if being a high-capital-gains investor with no big drawdown potential isn't a huge handicap. Maybe their advantages should be outlawed ;)


How do they not blow up with extreme leverage? Does someone give them a no-margin-call account, or much higher limits? If so, they are lucky. Heck, with no margin calls one can get rich just waiting to recover.

But if this is the case, they must have huge drawdowns from time to time. I guess leverage and tax havens could have made all of us richer faster. I wonder if being a high-capital-gains investor with no big drawdown potential isn't a huge handicap. Maybe their advantages should be outlawed ;)

 

I was also surprised by the high leverage employed (7x+) as mentioned in the book. It looks to me like perhaps where they really shine is risk management, as they have apparently survived for 30 years now without blowing up. A few times, the book mentions that they started to lose money because of bugs in their computerized trading system, and it took them time to find the issue because the code is so complex. That's a real risk, IMO.

 

FWIW, what these guys do is not value investing. They have no clue about value and the system doesn't care. There is a funny passage in the book where Mercer explains how they trade Chrysler stock, for example, not knowing that Chrysler had been taken over years ago by Daimler.

 

What they figured out however is how the stock market voting machine likely is going to work in the near term based on statistical signals. Fascinating stuff.


I was also surprised by the high leverage employed (7x+) as mentioned in the book.

 

Anytime I see a track record with exceptionally high returns over many years - I always assume leverage is involved.  Either outright margin or implicit margin via options.  It appears RenTech used both margin and options (including dubious tax "saving" strategies). 

 

wabuffo


Somewhat on topic here:

http://news.mit.edu/2019/model-beats-wall-street-forecasts-business-sales-1219

 

Tasked with predicting quarterly earnings of more than 30 companies, the model outperformed the combined estimates of expert Wall Street analysts on 57 percent of predictions. Notably, the analysts had access to any available private or public data and other machine-learning models, while the researchers’ model used a very small dataset of the two data types.

 

In a paper published this week in the Proceedings of the ACM SIGMETRICS Conference, the researchers describe a model for forecasting financials that uses only anonymized weekly credit card transactions and three-month earnings reports.

 

“Alternative data are these weird, proxy signals to help track the underlying financials of a company,” says first author Michael Fleder, a postdoc in the Laboratory for Information and Decision Systems (LIDS). “We asked, ‘Can you combine these noisy signals with quarterly numbers to estimate the true financials of a company at high frequencies?’ Turns out the answer is yes.”

 

Anyone who is interested needs to understand how proxy data is used. Proxy data can be risky depending on the strength of the relationship: if it breaks down, your model breaks down. And it must make sense, therefore human judgement is still required. Using credit card data as a proxy for consumer spending seems reasonable. Using this same data as a proxy for R&D spending at a nuclear energy company may not be.
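The kind of proxy relationship the article describes can be sketched as a simple regression of quarterly sales on summed credit-card volume — with the caveat from the paragraph above that the whole model lives or dies on the strength of that correlation. All numbers here are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented data: 13 weekly card-transaction totals per quarter, 8 quarters.
weekly_cards = rng.normal(100, 10, size=(8, 13))
card_per_q = weekly_cards.sum(axis=1)
# Pretend true sales are ~2.5x card volume plus noise.
sales = 2.5 * card_per_q + rng.normal(0, 50, size=8)

# Fit sales = a * card_volume + b and check how strong the proxy is.
a, b = np.polyfit(card_per_q, sales, 1)
r = np.corrcoef(card_per_q, sales)[0, 1]
print(f"slope={a:.2f}, correlation={r:.2f}")
# If r is weak, predictions made from card data are not to be trusted.
```

The correlation check is the human-judgement step: a model fit this way will happily extrapolate even after the proxy relationship has broken down.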


Somewhat on topic here:

http://news.mit.edu/2019/model-beats-wall-street-forecasts-business-sales-1219

 

Tasked with predicting quarterly earnings of more than 30 companies, the model outperformed the combined estimates of expert Wall Street analysts on 57 percent of predictions. Notably, the analysts had access to any available private or public data and other machine-learning models, while the researchers’ model used a very small dataset of the two data types.

 

In a paper published this week in the Proceedings of the ACM SIGMETRICS Conference, the researchers describe a model for forecasting financials that uses only anonymized weekly credit card transactions and three-month earnings reports.

 

“Alternative data are these weird, proxy signals to help track the underlying financials of a company,” says first author Michael Fleder, a postdoc in the Laboratory for Information and Decision Systems (LIDS). “We asked, ‘Can you combine these noisy signals with quarterly numbers to estimate the true financials of a company at high frequencies?’ Turns out the answer is yes.”

 

Anyone who is interested needs to understand how proxy data is used. Proxy data can be risky depending on the strength of the relationship: if it breaks down, your model breaks down. And it must make sense, therefore human judgement is still required. Using credit card data as a proxy for consumer spending seems reasonable. Using this same data as a proxy for R&D spending at a nuclear energy company may not be.

 

I was just going to post this.  8)

 

The other interesting part of that article:

 

Counterintuitively, the problem is actually lack of data. Each financial input, such as a quarterly report or weekly credit card total, is only one number. Quarterly reports over two years total only eight data points. Credit card data for, say, every week over the same period is only roughly another 100 “noisy” data points, meaning they contain potentially uninterpretable information.

 

This paragraph shows why it's tough to build models that do fundamental analysis. Even if you take the 500 companies in the S&P 500 and 10 years of financial reports, you have 5K data points. Even if you take quarterlies, you still have only 20K data points. Compare that to image classification datasets that run into millions of examples.

Since you likely can't build a single model for companies in different sectors (e.g., it's unlikely that an E&P and a retailer will share characteristics), the number of data points drops even further.
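The back-of-the-envelope counts in the two paragraphs above, written out (the 11-sector split is an illustrative assumption, roughly the GICS sector count):

```python
companies = 500  # S&P 500 universe
years = 10

annual_reports = companies * years         # 5,000 data points
quarterly_reports = annual_reports * 4     # 20,000 data points

# Split across, say, 11 GICS-style sectors and each per-sector model
# is left with well under 2,000 examples to learn from.
per_sector = quarterly_reports // 11

print(annual_reports, quarterly_reports, per_sector)
```

Compare per_sector (~1,800) with the millions of examples behind a typical image classifier and the data-scarcity argument is immediate.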

 

That's one of the reasons quants don't do a lot of fundamental analysis.

 

Value investors tend to think that they have an edge over algos because machines can't handle business fundamentals. It's quite possible that machines can handle business fundamentals just fine - given enough data. If quants can figure out how to get enough data or build models with less data required, look out.  8)


I would actually disagree with you there. I think the problem is that these modellers are requiring precision and not accuracy.

 

The entire point of statistical modelling is to use a sparse number of data points to create a generalized model. Go back to stats 101 and the sample-size problem. What is the generally accepted minimum number of samples? It is 25 or 30. At 200 points you can get to significance at 99% confidence.
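The sample-size claim can be illustrated with a plain t-test, using only numpy. The 0.2-sigma effect size is an assumption chosen for illustration, and 2.6 is roughly the two-sided 99% critical t-value for these sample sizes:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed effect: a true mean of 0.2, in units of one standard deviation.
effect, sigma = 0.2, 1.0

for n in (30, 200):
    sample = rng.normal(effect, sigma, size=n)
    t = sample.mean() / (sample.std(ddof=1) / np.sqrt(n))
    # ~2.6 is approximately the two-sided 99% critical value here.
    print(f"n={n}: t={t:.2f}, significant at 99%: {abs(t) > 2.6}")
```

With an effect this small, n=30 usually fails the 99% bar while n=200 has a reasonable (though not guaranteed) chance of clearing it — which is the poster's point about 200 points versus 25-30.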

 

I have some coworkers from medical research - back then we used 50 or 100 samples to draw medical conclusions... and now portions of the bank claim they can't build a sufficiently accurate model due to lack of data when they have datasets in the thousands.

 

The trade-off is that you can use 500 data points to create a generalized model, but it will lack precision. Or you can build a model with 500,000,000 data points (we have them - do not believe anyone who says they lack data unless it is risk-specific), but it lacks the ability to generalize over time.

 

Modellers try to take the best of both worlds with various methods to reduce overfitting (you can google the regularization methods), but IMHO there is only one true method, which is intuition - and this currently cannot be modelled, or at least I am not aware how.
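One of the regularization methods the poster alludes to is ridge regression, which shrinks coefficients toward zero, trading a little bias for less variance. A minimal closed-form sketch with arbitrary made-up data and an arbitrary penalty:

```python
import numpy as np

rng = np.random.default_rng(4)

# Few noisy points, many features: a classic recipe for overfitting.
n, p = 30, 20
X = rng.normal(size=(n, p))
true_w = np.zeros(p)
true_w[:3] = [2.0, -1.0, 0.5]          # only 3 features carry real signal
y = X @ true_w + rng.normal(0, 1, size=n)

def ridge(X, y, lam):
    """Closed-form ridge: w = (X'X + lam*I)^-1 X'y."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_ols = ridge(X, y, 0.0)    # ordinary least squares (lam = 0)
w_reg = ridge(X, y, 10.0)   # shrunk coefficients

print(np.linalg.norm(w_ols), np.linalg.norm(w_reg))  # ridge norm is smaller
```

The shrinkage is exactly the precision-for-generalization trade the paragraph above describes: the regularized fit matches the training data a bit worse but is less likely to be chasing noise.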


I would actually disagree with you there. I think the problem is that these modellers are requiring precision and not accuracy.

 

The entire point of statistical modelling is to use a sparse number of data points to create a generalized model. Go back to stats 101 and the sample-size problem. What is the generally accepted minimum number of samples? It is 25 or 30. At 200 points you can get to significance at 99% confidence.

 

I have some coworkers from medical research - back then we used 50 or 100 samples to draw medical conclusions... and now portions of the bank claim they can't build a sufficiently accurate model due to lack of data when they have datasets in the thousands.

 

The trade-off is that you can use 500 data points to create a generalized model, but it will lack precision. Or you can build a model with 500,000,000 data points (we have them - do not believe anyone who says they lack data unless it is risk-specific), but it lacks the ability to generalize over time.

 

Modellers try to take the best of both worlds with various methods to reduce overfitting (you can google the regularization methods), but IMHO there is only one true method, which is intuition - and this currently cannot be modelled, or at least I am not aware how.

 

You can model things very close to intuition, but you need highly non-linear models that require a lot of data points.  For trading on info from 10-Qs, I can assure you a huge problem is data issues.  I'm not sure what you mean when you say lacking precision versus generalization - are you talking about the bias-variance trade-off?  The bias-variance trade-off says a more nuanced model requires more data to generalize well, so a 500-million-data-point model should generalize very well unless your model complexity is very high. 

 

Additionally, idk if they just use deep learning (I'm pretty confident they do), but traditional regularization methods are not used as often any more.  The big thing is fancy data augmentation techniques.  It's not your parents' linear regression with regularization that RenTech is using, and the ability to use these models gives them a huge advantage over models that can train on 500 to 5,000 data points, like regularized regression. 


I would actually disagree with you there. I think the problem is that these modellers are requiring precision and not accuracy.

 

The entire point of statistical modelling is to use a sparse number of data points to create a generalized model. Go back to stats 101 and the sample-size problem. What is the generally accepted minimum number of samples? It is 25 or 30. At 200 points you can get to significance at 99% confidence.

 

I have some coworkers from medical research - back then we used 50 or 100 samples to draw medical conclusions... and now portions of the bank claim they can't build a sufficiently accurate model due to lack of data when they have datasets in the thousands.

 

The trade-off is that you can use 500 data points to create a generalized model, but it will lack precision. Or you can build a model with 500,000,000 data points (we have them - do not believe anyone who says they lack data unless it is risk-specific), but it lacks the ability to generalize over time.

 

Modellers try to take the best of both worlds with various methods to reduce overfitting (you can google the regularization methods), but IMHO there is only one true method, which is intuition - and this currently cannot be modelled, or at least I am not aware how.

 

I believe we are talking about different things. You are talking about statistical curve fitting. I am talking about DNNs that can model and generalize real-world info and deal with a high number of factors influencing corporate results going forward. Curve fitting is the reason why Wall Street analyst predictions are subpar and also why most investors underperform. Most of them expect the future to look like the past - which is what curve fitting is.

 

People who outperform are:

1. People who have higher accuracy model (whether hand built or ML/automatic).

2. People who have longer term predictions than others

 

If the future looks like the past, nobody can outperform simple curve fitting for 1. or 2. So people can outperform only if curve fitting is wrong. Determining that it is wrong can be based on real world knowledge, second order thinking, intuition, whatever. And these can be ML/DNNed if sufficient data were available. And sufficient data here is way larger than what's needed for curve fitting.

 

I am not sure what you are talking about when you say "you can build a model with 500,000,000 datapoints" - no you cannot. There are not enough companies on Earth to have that many datapoints. You can do that for price data, but not for fundamental data like yearly sales/profits/etc. There's a reason people build DNNs based on data that's available daily or even better every (nano/micro/milli)second. But that excludes most fundamental data.

 

* People can also outperform by choosing an area where competition is low and their models don't have to compete with competent curve fitters.

** People and algos can also outperform by exploiting (psychological/emotional/technical/etc.) drawbacks of other actors. I'm not talking about this now though, even though it's a fascinating area on its own.

 

Edit: For fun and clarity, I'll classify how I see some investors:

- Graham cigar butt investing: Mostly expecting future to differ from the past.

- Growth investing: Mostly expecting company to grow longer than others.

- Buffett: higher accuracy model and longer prediction than others.

- Writser  ;) : choose area where competition is low and you don't have to compete with ...

All of the above (may) exploit the drawbacks of other actors:

- Graham cigar butt investing: exploit others giving up on underperforming company.

- Growth investing: exploit others undervaluing the growth company even when growth is known.

- Buffett: exploits the heck out of the irrationality of other actors.

- Writser  ;) : exploits the behavior of limited set of actors in special situations.


I read the book recently and have been thinking about the same thing. Clearly there is more than one way to skin a cat. Good point by someone on here that his returns are fueled by leverage. One way the book lays out their strategy: it's flipping a coin with 50.5% odds and betting heavily enough, often enough, that the 0.5% edge pays off.
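The coin-flip framing can be checked directly: a 0.5% edge per even-money bet is invisible over a handful of flips but very reliable over a huge number of them. The bet count and $1 stakes below are illustrative, not Renaissance's actual figures:

```python
import numpy as np

rng = np.random.default_rng(5)

p_win, n_trades = 0.505, 1_000_000   # tiny edge, huge number of bets
wins = rng.random(n_trades) < p_win
profit = 2 * wins.sum() - n_trades   # +$1 per win, -$1 per loss

print(f"expected edge: {n_trades * (2 * p_win - 1):.0f}, realized: {profit}")
# Expected profit is 10,000 units with a standard deviation of only ~1,000,
# so over a million bets the 0.5% edge becomes a near-sure thing.
```

This is the law of large numbers doing the work: the edge per bet stays at 0.5%, but the noise shrinks relative to it as the bet count grows.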

 

In regard to the comparison of returns with Buffett and others, I think they are incorrect (but I would like to be held to account on this). From what I've read, they return money every year; their returns aren't compounded. They can't do what they're doing with a larger capital base - if they did, their returns would diminish significantly. Then they'd be in line with or lower than everyone else's.

 

Buffett has always said that if he had a couple million he could do 50% a year. By returning money every year, that is what Renaissance is doing.


I would actually disagree with you there. I think the problem is that these modellers are requiring precision and not accuracy.

 

The entire point of statistical modelling is to use a sparse number of data points to create a generalized model. Go back to stats 101 and the sample-size problem. What is the generally accepted minimum number of samples? It is 25 or 30. At 200 points you can get to significance at 99% confidence.

 

I have some coworkers from medical research - back then we used 50 or 100 samples to draw medical conclusions... and now portions of the bank claim they can't build a sufficiently accurate model due to lack of data when they have datasets in the thousands.

 

The trade-off is that you can use 500 data points to create a generalized model, but it will lack precision. Or you can build a model with 500,000,000 data points (we have them - do not believe anyone who says they lack data unless it is risk-specific), but it will lack the ability to generalize over time.

 

Modellers try to take the best of both worlds with various methods to reduce overfitting (you can google the regularization methods), but IMHO there is only one true method, which is intuition - and this currently cannot be modelled, or at least I am not aware how.
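For readers who want to see one of those regularization methods concretely, ridge regression is about the simplest. This is a generic numpy sketch, not anything specific to the banks or funds discussed in the thread:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """L2-regularized least squares: solve (X'X + lam*I) w = X'y.
    A larger `lam` shrinks the coefficients toward zero, trading a
    little bias for less variance -- i.e. less overfitting."""
    A = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ y)
```

Turning `lam` up monotonically shrinks the coefficient norm, which is exactly the precision-versus-generalization dial being argued about above.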

 

You can model things very close to intuition, but you need highly non-linear models that require a lot of data points.  For trading on info from 10-Qs, I can assure you a huge problem is data issues.  I'm not sure what you mean by lacking precision versus generalization.  Are you talking about the bias-variance trade-off?  The bias-variance trade-off means a more nuanced model requires more data to generalize well, so a 500-million-point model should generalize very well unless your model complexity is very high.

They are using consumer credit data to estimate 10-Q reported sales. There are most definitely hundreds of millions of data points for consumer data and market data - we build various types of ML and NN models using this data and they still suffer from overfitting. In some cases (GBM) they actually are your parents' logistic regressions and decision trees... but we still run into the problem of overfitting, particularly with the NNs. We are purposefully using less data (large holdout samples) and other techniques to battle the overfitting problem. Where you run into real problems is specific risk-stripe data, such as ops risk, where you only have a handful of incidents over 20 years. Difficult to model using any methodology.
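The holdout idea mentioned here can be shown in miniature: fit an over-flexible model on part of the data and score it on data it never saw. A toy polynomial example (my construction, not the bank's actual models):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, 60)
y = np.sin(3 * x) + rng.normal(0, 0.5, 60)  # true signal plus noise

train_x, hold_x = x[:40], x[40:]  # a large holdout, as described above
train_y, hold_y = y[:40], y[40:]

# A degree-15 polynomial has enough wiggle room to chase the noise.
coefs = np.polyfit(train_x, train_y, deg=15)
train_mse = np.mean((np.polyval(coefs, train_x) - train_y) ** 2)
hold_mse = np.mean((np.polyval(coefs, hold_x) - hold_y) ** 2)
# train_mse looks flattering; hold_mse reveals the overfit.
```

The in-sample error is always the optimistic one, which is why the holdout score, not the training score, is what gets trusted.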


No.  Your credit card data has 100 million data points, but you only have revenue numbers every quarter, so that's your bottleneck.  Even though you have 100 million x values, you only have about 500 y values.  That being said, you may have more in the cross-section, but then you need to do something fancy like transfer learning, as it's not so easy to use revenue from one company to predict another.
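The bottleneck is easy to see in code: however many raw transactions you have, they collapse into one training row per reported quarter, the same granularity as the 10-Q label being predicted. A schematic sketch (the data shape is made up for illustration):

```python
from collections import defaultdict
from datetime import date

def quarterly_panel(transactions):
    """transactions: iterable of (date, amount) card swipes.
    Millions of rows in, but only one aggregated feature per
    quarter out -- matching the quarterly revenue label."""
    totals = defaultdict(float)
    for d, amount in transactions:
        totals[(d.year, (d.month - 1) // 3 + 1)] += amount
    return dict(totals)

# Ten thousand swipes across one year still yield just four labeled rows.
swipes = [(date(2019, 1 + i % 12, 1 + i % 28), 1.0) for i in range(10_000)]
```

However rich the x side is, the number of (x, y) training pairs is capped by the number of reporting periods.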

 

There are a lot of quants still using logistic regression and decision trees, but based on the people Rentech hires, I don't think they focus on these things.  The big-name quants who focus on building better models (as opposed to finding more interesting factors) focus on deep learning and heavy-duty models, and so they can't use fundamental data.  You sound like you are closer to the AQR/Tobias Carlisle approach to quantitative investing, where you try to find predictive factors.  The Rentech approach is quite different from yours.  And again, it's forex, so how predictive are your fundamental indicators going to be outside of occasional tail events?



Renaissance does statistical arbitrage. They operate much like physicists trying to predict the behavior or movements of stars, quantum particles, etc.: many iterations of hypothesis testing against lots of data. It does not look anything like value investing.
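For a flavor of what statistical arbitrage looks like in its simplest textbook form, here is a toy mean-reversion signal on a price spread. This is my illustrative sketch; Renaissance's actual signals are secret and vastly more sophisticated.

```python
import statistics

def spread_signal(spread, lookback=20, entry=2.0):
    """Short the spread when it sits `entry` standard deviations
    above its rolling mean, long when equally far below, else flat."""
    signals = []
    for i in range(lookback, len(spread)):
        window = spread[i - lookback:i]
        mu, sd = statistics.fmean(window), statistics.pstdev(window)
        z = (spread[i] - mu) / sd if sd > 0 else 0.0
        signals.append(-1 if z > entry else 1 if z < -entry else 0)
    return signals
```

Each trade is a small hypothesis ("this spread reverts to its mean") tested over and over across thousands of instruments, which is where the physics-style iteration comes in.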


A few comments:

 

Returns

You can’t compare the returns given in the book and reach the conclusion, as Zuckerman does, “that no one in the investment world comes close [to Medallion].”  The returns for the funds listed in the appendix are internal rates of return (and effectively the compounded annual growth rate for Berkshire).  But Medallion’s is an arithmetic return.  For a fund like Medallion that pays out most of its earnings (they’ve distributed 10x their current capital), the arithmetic return will be very different from the compounded return.  Consider that if Simons was indeed able to compound at 66%, the initial $18 million in Medallion would now be worth $120 trillion ($18 million*1.66^31 = $120 trillion).
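The $120 trillion figure checks out arithmetically, using the numbers given above:

```python
initial = 18e6                 # Medallion's initial capital, per the book
final = initial * 1.66 ** 31   # 66% compounded for 31 years

# final comes out around 1.2e14, i.e. roughly $120 trillion -- the
# reductio ad absurdum showing Medallion never actually compounded at 66%.
```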

 

I would actually prefer an investment that returns, say, 15% and lets me reinvest at 15% over one that returns 66% but can't reinvest.  It wouldn't even be a close call.
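To see why it wouldn't be close, compare a 15% return fully reinvested against a 66% return paid out on a fixed base, using the book's $18 million over 31 years as the common starting point (the comparison itself is my arithmetic, not from the post):

```python
initial, years = 18e6, 31

reinvested_15 = initial * 1.15 ** years          # compounds the whole way
paid_out_66 = initial + initial * 0.66 * years   # 66% of a static base, distributed

# The 15% compounder ends near $1.4 billion; the 66% payer accumulates
# only about $0.39 billion of principal plus distributions.
```

Even with the cumulative payouts counted in full, the modest compounder finishes several times ahead.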

 

Leverage

Renaissance isn’t the only firm operating with a lot of leverage in the financial world.  Banks are levered +10x and do fine.  Same with trading houses (Marc Rich + Co, Phibro, etc.), which run +20x.  Even market-makers (Citadel and all the Chicago prop shops) seem to do OK with +30x leverage.  What’s even more remarkable is that they all do it with either short-term funding (commercial paper, letters of credit, repo, etc.) or call loans (margin debt, demand deposits, etc.). 
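The leverage arithmetic those firms live on is simple but unforgiving. A generic sketch (the example rates in the comment are mine, not the firms'):

```python
def levered_roe(asset_return, leverage, funding_cost):
    """Return on equity when assets are `leverage` x equity and the
    borrowed (leverage - 1) portion is financed at `funding_cost`."""
    return leverage * asset_return - (leverage - 1) * funding_cost

# At 30x with funding at 0.5%: a 1% gain on assets is a 15.5% gain on
# equity, while a 1% loss on assets wipes out 44.5% of equity.
```

Which is why the short-term, callable nature of the funding matters so much: the same multiplier that makes the gains works on the losses.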

 

I wouldn’t want to have 30:1 in call loans against soybeans or some other commodity, but a lot of people seem to make it work. 

 

“Is value investing not the only way?”

I don’t understand what the big deal is about Renaissance.  Is it really that surprising someone with a 180 IQ and a background in mathematics is making money trading?  He’s not the first and he won’t be the last. 

