TSLA - Tesla Motors


Speaking of blind spots, the following report (released yesterday) may be relevant to the discussion about the transition to self-driving, even if anecdotal.

Summary: A California Apple employee relied too heavily on Tesla's self-driving technology and had a fatal accident while distracted by a gaming application.

https://www.ntsb.gov/news/events/Documents/2020-HWY18FH011-BMG-abstract.pdf


 

So a guy plays a game on his cellphone while driving and we have to come up with ways to blame the car manufacturer? We may never have self-driving vehicles if people respond to these events this way (anyone who is short will love it, though). Autopilot advises you to keep your hands on the steering wheel. So do manufacturers of cars without Autopilot. The same applies when lane assist and cruise control are on.

 

And no, autopilot does not mean the car drives itself without supervision. Just like you wouldn't want to walk into the cockpit of your commercial airline flight and see the pilots playing video games while the plane was on "autopilot". The term is not ambiguous.

 

Yes, anecdotes are the #1 best form of research in this thread. Look back on all the bear arguments in this thread for many more anecdotes over the years. No one really picked up on the story about the Taycan fire, but Tesla anecdotes = click/retweet crack so they get amplified.


For once I agree with you, Dalal. Let's look at some factual evidence instead of anecdotes: https://venturebeat.com/2020/02/26/california-dmv-releases-latest-batch-of-autonomous-vehicle-disengagement-reports/

 

Waymo’s 153 cars and 268 drivers covered 1.45 million miles in California in 2019, eclipsing the company’s 1.2 million miles in 2018, 352,000 miles in 2017, and 635,868 miles in 2016.

 

Tesla reported zero miles driven autonomously on public roads in California during all of 2019, as it has for the past three years. The company says that it conducts its testing via simulation, in laboratories, on test tracks, and on public roads in various locations around the world, and that it “shadow-tests” its cars’ autonomous capabilities by collecting anonymized data from over 400,000 customer-owned vehicles “during normal driving operations.”

 

Do you get this hard for your significant other? Or is Elon the only one that does it for you.....?

 

Honda's 2015 Civic models had a camera on the passenger-side mirror that was used to check blind spots. They discontinued it because they found it caused too many distractions and led drivers to not actually turn and look themselves. I find it ironic that you tout the magic of autonomous vehicles for their ability, or future ability, to mitigate human error, yet when that same kind of feature “encourages” a human error it's insignificant.

 

I don't think it's out of the question to discuss the relationship between autonomous systems and human decision making, and the potential impact the former might have on the latter.


 

Lol. Incapable of sifting signal from noise. A terrible way to practice investing, a suicidal way to practice short selling. Keep doing what you're doing.


 

What is your track record of all the posts you've made on this name? Not impressive. Maybe you could devote the failed opportunity costs to actually acquiring a significant other.

 

Autopilot does not make a car an autonomous vehicle, just as autopilot does not make a 747 an autonomous airplane. It does not "encourage" the pilot to fall asleep. An argument can be made that cruise control "encourages" human error. So do cell phones, but we are not willing to give them up. That doesn't mean AAPL is to be held accountable if a teen is texting while driving. But I guess in your world, accountability doesn't matter.

 

Yes, the guy who launched a rocket company and a successful EV company simultaneously is unimpressive.


 

The difference is I don’t claim to be a stock savant. And I’m not running down individuals who have taken time to approach Tesla’s thesis from different angles. Apparently you have it all figured out.... ::)


Lol. Look back at my posts. I was ridiculed here by the consensus in mid-2019, when this thing was in the low 200s... I never claimed to be a savant. All I said was it ain't easy being a contrarian. Then I left for about 5-6 months while this thing rocketed.

 

Well, I was right, being a contrarian is not easy, but it sure did pay well.


I drove 45 miles today, returning to my starting point having expended 12.2 kWh. 

 

That's a 313 mile range, based on a new 85 kWh battery.  The car is rated for 265 miles of range when new.
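For anyone checking the arithmetic, here is the quick calculation behind that figure (the 85 kWh is the nominal capacity of a new pack, which is what I'm assuming here, not a measured value):

```python
# Back-of-the-envelope check of the implied range, using the figures above.
miles_driven = 45.0          # round trip back to the starting point
energy_used_kwh = 12.2       # kWh expended over the trip
battery_capacity_kwh = 85.0  # assumed nominal capacity of a new 85 kWh pack

efficiency_mi_per_kwh = miles_driven / energy_used_kwh            # ~3.69 mi/kWh
implied_range_mi = efficiency_mi_per_kwh * battery_capacity_kwh   # ~313 miles

print(f"{efficiency_mi_per_kwh:.2f} mi/kWh -> implied range {implied_range_mi:.1f} miles")
```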

 

I recently read that regenerative braking captures only about 60% of the energy, so I tried to drive without using any braking at all where possible.  It made a big difference.

 

I had the climate control disabled, and the sunroof open.  It was close to 80 today.


As someone who has a Class A CDL and about 3k miles of road time (not really a lot): stay the hell away from semis. Keeping an 8.5 ft wide truck in a 9-12 ft wide lane is difficult enough on a perfect day. Throw in some crosswinds or inclement weather and it's a whole different ball game, especially in construction zones. I see so many morons driving right next to them on lane merges with barriers on either side. And remember, if you're riding their ass they can't see you. The rear blind spot is something like 30-50 ft depending on the trailer.

 

And if you are in the trucker's rear blind spot, then you are blind as well. You can't see what happens in front of the truck, and you won't know he is about to slam on his brakes until after he does so; add in your reaction time and a crash is hard to avoid.

 

More reasons why we need self-driving trucks with 360-degree cameras, ultrasonics, radar, and V2V comms.

 

At the very least, driver-assist technology. Making turns with a 53-foot trailer and having to make adjustments (possibly stop and reverse) mid-turn due to other vehicles, pedestrians, signs, etc. seems way too difficult.

I think these trucks have a pretty nifty driver-assist system.

 

 

And yea... don't try this at home.


 

It's strange how there are many Tesla owners on this website who have been enjoying the product ("best car I've ever owned") while at the same time, claims of "commoditization", impending "Tesla killers", and a lack of a moat abound (from others). Seems like an underlying disconnect between consumers & investors.


Good video on the potential of Tesla insurance:

That was interesting and thank you.

 

The positives:

-The co-selling of the insurance product as an "add-on" could be a marketing tool and a source of profit.

-The safety technology and autonomy aspect will likely change the pattern of accidents and result, on a net basis, in a lower risk profile, so the insurance product will need to adapt, and Tesla may be well positioned as a first mover for its cars.

-Perhaps the higher insurance quotes that people were getting (the insurance industry can be conservative with new products) were a negative factor for some buyers, so offering the insurance product may help sales in the interim.

 

The negatives:

-The video is promotional and shows a lack of fundamental understanding of underwriting.

-They seem to imply that, even without the safety features and autonomy aspect, Tesla cars are safer (or is it the drivers?). Why would that be? (around 4:34 in the video)

-Even if Tesla is able to price individual policies better (in theory), in the aggregate they are unlikely to do better than the industry as a whole once the period of adaptation is over. Also, offering a lower-priced product overall is unlikely to produce larger profits on a large and sustainable scale.

-The auto insurance industry is very good at capturing the essence of risk with very simple procedures. For example, if lower premiums are warranted, they may rapidly find that most of the refinement can be captured simply by age group, occupation, zip code or some other socio-economic indicator (a toy sketch of such a rating plan follows this list).

-The video seems to imply that the float harvested from this venture could be reinvested into other ventures. Tesla has a fronting arrangement with an insurer, which does not build float for Tesla, as the risk (and the float) is transferred to a third party. Even if Tesla were the third party, rules and regulations are very strict about where funds can be invested, and regulators would be particularly vigilant with a new product.

-It is possible that Tesla becomes an insurance behemoth (car, home, etc.), but this is a field completely unrelated to car manufacturing. I don't see anything inherently clear as to why Tesla would integrate technology into underwriting better than the industry as a whole.
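To illustrate how coarse conventional rating can be while still capturing most of the risk signal, here is a toy multiplicative rating plan. The base rate and every factor below are invented for illustration only and do not come from any real insurer's filing:

```python
# Toy multiplicative rating plan: premium = base rate x product of factors.
# All numbers are invented for illustration only.
BASE_ANNUAL_PREMIUM = 900.0

AGE_FACTOR = {"16-25": 1.60, "26-65": 1.00, "65+": 1.15}
ZIP_FACTOR = {"urban": 1.25, "suburban": 1.00, "rural": 0.90}
VEHICLE_FACTOR = {"high_performance_ev": 1.30, "standard": 1.00}

def quote(age_band: str, zip_class: str, vehicle_class: str) -> float:
    """Return an annual premium for a very coarse risk profile."""
    return (BASE_ANNUAL_PREMIUM
            * AGE_FACTOR[age_band]
            * ZIP_FACTOR[zip_class]
            * VEHICLE_FACTOR[vehicle_class])

print(quote("26-65", "suburban", "high_performance_ev"))  # 1170.0
```

A handful of variables like these already explain most of the premium differences between drivers, which is why the marginal value of additional telematics data may be smaller than the video suggests.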

 

Edit: The eventual addition of recording/telematics features raises very difficult questions about privacy and liability (the questioning is already underway).


The whole idea of Tesla insurance being cheaper than other insurers is hilarious.

 

But I'll take my Tesla bear cap off and put my reasonableness pants on to elaborate.

 

Are Tesla drivers a new breed, or are they the same people who drove Audis, Mercs and Beemers before driving Teslas?

Why does this matter? Because insurance companies have decades of statistics on how prone people are to having accidents.

 

Since these are the same people who drove Audis, Mercs and Beemers before driving Teslas, do you think that once they start driving Teslas they become safer drivers or less safe drivers? I'm in the camp that they become less safe drivers, not because they drive Teslas, but because they have started driving a new type of car (an electric one that also accelerates faster than a regular car). I, for example, am used to driving a car with a manual gearbox. I don't think I would start driving more safely by switching to an automatic, or by driving a US car instead of an EU one (sitting on the right-hand side of the car). Similarly, I don't think people become safer drivers in a Tesla than in their previous car. Those cases of sudden unintended acceleration? If it is not Tesla's fault, then it's the driver's fault.

 

So now we know that these drivers are probably more prone to having accidents. At the same time, other drivers might become less safe with electric cars around, because (1) electric cars are quieter than ICE cars and (2) electric cars accelerate faster than ICE cars.

 

So we know why electric car drivers need to pay higher insurance premiums than ICE car drivers (for now; I presume that once people have more experience driving electric cars, the rate of accidents will drop, and insurance premiums with it).

 

So why is Tesla (or its insurance carrier) able to offer lower insurance premiums than other insurers? I don't think it is because Tesla has better information on the accident rate than other insurance companies; Tesla simply started with this business line because it fits the narrative that "Teslas are safe cars".

 

I believe that the agreement between Tesla and its insurance carrier is roughly as follows:

- Tesla gets a commission from its carrier for every driver who takes out a Tesla insurance policy (as in a normal agent/broker relationship)

- Tesla gets a rebate (bonus) from its carrier if the actual accident rate (payout to insured Tesla drivers) is below the expected accident rate

- Tesla has to pay its carrier if the actual accident rate (payout to insured Tesla drivers) is above the expected accident rate
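To make that concrete, here is a rough sketch of how the annual settlement could work under such an agreement. The structure is my guess above, and every number and parameter below is a made-up illustration, not an actual term:

```python
# Rough sketch of the guessed Tesla / carrier settlement. All parameters are
# made-up illustrations of the structure described above, not actual terms.

def tesla_net_result(written_premium: float,
                     expected_loss_ratio: float,
                     actual_losses: float,
                     commission_rate: float = 0.10,
                     profit_share: float = 0.50) -> float:
    """Tesla's net receipt (+) or payment (-) for one policy year."""
    commission = commission_rate * written_premium
    expected_losses = expected_loss_ratio * written_premium
    # Rebate if actual losses come in below expectation, clawback if above.
    experience_adjustment = profit_share * (expected_losses - actual_losses)
    return commission + experience_adjustment

# Better-than-expected year: Tesla earns its commission plus a rebate.
print(tesla_net_result(10_000_000, 0.65, 5_500_000))   # -> 1500000.0
# Worse-than-expected year: the clawback can exceed the commission.
print(tesla_net_result(10_000_000, 0.65, 9_000_000))   # -> -250000.0
```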

 

 

 

 


The following is based on various reasonable assumptions and inferences, and there could be some kind of risk retention by Tesla, but I think that, in this specific case, State National (a Markel subsidiary), as a recognized, licensed and rated fronting agent, acts as the plumbing that connects alternative capital providers looking for uncorrelated returns (such as primary insurance lines) with Tesla, which is looking for third-party capital to take over the risk from issuing insurance policies. This is reminiscent of the rise of alternative capital in the insurance-linked securities market. The experience here may be different, but in the ILS market the rise in appetite for those returns was associated with policies issued too cheaply, and things have started to normalize.

If interested, the short video can be instructive (especially starting at 2:12 for the Tesla relevance):

https://www.statenational.com/fronting/fronting-explainer/
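For intuition on the fronting mechanics, here is a minimal sketch of how premium might flow under such an arrangement. The 5% fronting fee and all other figures are illustrative assumptions, not State National's or Tesla's actual terms:

```python
# Illustrative premium flow in a fronting arrangement (all figures assumed).
gross_premium = 100.0        # per 100 units of premium written on the policies

fronting_fee_rate = 0.05     # assumed fee kept by the licensed fronting carrier
fronting_fee = fronting_fee_rate * gross_premium
ceded_premium = gross_premium - fronting_fee  # risk (and associated float) passes through

# The fronting carrier issues the policy but retains little or no risk, so the
# reserves and float sit with the third-party capital provider, not with Tesla.
print(f"fronting carrier keeps {fronting_fee:.0f}, capital provider receives {ceded_premium:.0f}")
```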

 


 

I'm not impressed by GM: 400 miles of range with a 200 kWh battery, and the car isn't even in production.

 

Compare that to the 390 miles of range that Tesla currently gets on the production Model S with a 100 kWh battery, a pack only half the size of GM's.

 

Tesla's upcoming Roadster has a range of 620 miles, and the largest Cybertruck configuration will be 500 miles.
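Put differently, here is the efficiency implied by those numbers, using only the range and pack-size figures quoted above (the GM numbers are the announced ones, not measured):

```python
# Miles per kWh implied by the quoted range and pack-size figures.
vehicles = {
    "GM (announced)": {"range_mi": 400, "pack_kwh": 200},
    "Tesla Model S":  {"range_mi": 390, "pack_kwh": 100},
}

for name, spec in vehicles.items():
    mi_per_kwh = spec["range_mi"] / spec["pack_kwh"]
    print(f"{name}: {mi_per_kwh:.1f} mi/kWh")
# GM (announced): 2.0 mi/kWh
# Tesla Model S: 3.9 mi/kWh
```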

 
