110V and 220V are outdated nominal voltages. Many Model S owners may be seeing the actual voltage at their home (as measured by the car) for the first time. It grates on me when I continue to see references to 110V or 220V. Maybe I'm the Brian H of the voltage world.
If you actually measure 110V or 220V, you have a system with a problem; neither of those voltages should ever occur in the US. US residential voltage is nominally 120V +/- 5%, and the national average delivered is about 118V. The higher voltage is 240V if you're on a standard split-phase (two-hot-wire) feed, or 208V if you're on a three-phase system, also +/- 5%. Most higher-voltage devices are labeled something like 208-240V because they need to work on both systems.

An electric motor striving to maintain a fixed RPM handles the lower voltage by pulling more amps to get the same power, and it generally succeeds, up to a point. The Tesla charger does not. It pulls its set amperage, let's say 40A for a standard single charger on a NEMA 14-50. Connected to a 240V system, that yields a nominal 9.6 kW. On a 208V system, it yields a nominal 8.3 kW, about 13% less power and 13% fewer miles per hour of charge. Unlike the motor, it will not try to pull the 46A necessary to achieve the same 9.6 kW on 208V. In reality, with the voltage drops that WILL occur from the main panel to the car, voltage may be down an additional few percent from nominal when running at 40A (80% of the 50A circuit rating).
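The arithmetic above is just power = volts x amps with the amps held fixed by the charger. A minimal sketch (the charger's fixed-current behavior is as described in the post; the function names are my own):

```python
# Charge power is voltage times current. The Tesla charger holds
# current fixed, so delivered power scales directly with voltage.
def charge_power_kw(volts: float, amps: float) -> float:
    return volts * amps / 1000.0

p_240 = charge_power_kw(240, 40)            # 9.6 kW on a split-phase feed
p_208 = charge_power_kw(208, 40)            # 8.32 kW on a three-phase feed
loss_pct = (p_240 - p_208) / p_240 * 100    # ~13.3% less power

# Current that WOULD be needed on 208V to match 9.6 kW
# (the charger will not actually do this, unlike a motor):
amps_needed = 9600 / 208                    # ~46.2 A
```

Note the percentage loss is just 1 - 208/240, so it holds for any fixed amperage, not only 40A.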
In my own case, my garage has a subpanel fed from a 50A breaker in the house. I installed a NEMA 14-50 fed from the subpanel on a 50A breaker as specified. Because of either other trivial loads in the garage panel that I think are turned off, or a weak 30-year-old feed breaker in the house, pulling 40A to the car trips the house breaker that feeds the garage in about an hour. I have no problem with the car set to 35 amps, and I always get a full charge overnight. My measured voltage with no load on the car is 241V; at a 40A charge it is 233V. I think that's a fairly normal drop considering it runs from a subpanel and through a UMC cable that gets warmer than it really should (see the thermal photos posted in another thread; they all do that). Someday maybe I'll measure the voltage at all the intermediate points, but for now, 35A works fine.
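Those two measurements are enough to estimate the total wiring resistance between the utility and the car via Ohm's law. A rough sketch using the numbers from my setup (treating the no-load reading as the source voltage, which ignores any sag upstream of the meter point):

```python
# Two measurements: voltage with the car idle, and voltage while
# charging at 40 A. The difference divided by the current gives the
# effective round-trip resistance of the whole feed.
v_no_load = 241.0   # volts, car connected but not charging
v_loaded = 233.0    # volts, charging at 40 A
amps = 40.0

drop_volts = v_no_load - v_loaded           # 8 V sag under load
resistance_ohms = drop_volts / amps         # 0.2 ohm total feed resistance
drop_pct = drop_volts / v_no_load * 100     # ~3.3% drop
heat_watts = drop_volts * amps              # ~320 W dissipated in the wiring
```

That ~320 W heating up the breakers, subpanel feed, and UMC cable is consistent with the cable running warm.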
The power company substations have to deal with loads that change during the day and the resulting voltage drop from inherent resistance in their distribution system. Their transformers have "tap changers" that adjust the voltage up or down to compensate for losses in the distribution system as load changes. Sometimes they don't work. A few years ago, at the university where I was responsible for computer power, about 30% of the office UPSs reported switching online to buck the incoming voltage down. The supplied voltage was 128-129 volts. After several emails to the campus power folks, the reply came back that they had discovered one of the substation transformers had been incorrectly left in "Manual" for voltage control. So much for nominal voltage provision.
If you're having issues, measure the voltage at no load and at full load. If it's out of spec (+/- 5% of the respective system voltage), contact an electrician or your power company.
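That check can be expressed in a few lines. A sketch assuming the +/-5% tolerance quoted above and the three US nominal voltages discussed in this post (the function name and nominal-picking heuristic are mine):

```python
# Pick the nominal voltage closest to the measurement, then test
# whether the measurement falls within the +/-5% tolerance band.
def in_spec(measured: float) -> bool:
    nominal = min((120.0, 208.0, 240.0), key=lambda n: abs(n - measured))
    return abs(measured - nominal) <= 0.05 * nominal

in_spec(118)   # True: inside the 114-126 V band around 120 V
in_spec(129)   # False: above the 126 V limit, like my substation case
in_spec(233)   # True: inside the 228-252 V band around 240 V
```

If the no-load reading passes but the full-load reading fails, suspect your own wiring; if both fail, it's the power company's problem.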