
VFD adjustments when using single phase input.

It's popular to change the switching frequency to mask audible noise. But how about performance when the drive and motor are already de-rated?

Example:
7.5 HP motor powered by a VFD on single phase. The motor nameplate is 23 A. The de-rate factor is going to ask the VFD for 23 x 1.73 = 39.8 A.
If the VFD switching frequency range is 2, 3, 4, 5, 6, 7, 8, 9, 10 kHz, what would be an optimum number, in the sense of improving any performance factor?
The factory default is 4 kHz.
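For reference, here is a minimal sketch of that single-phase input de-rating arithmetic. The 7.5 HP / 23 A figures are just the example numbers from this post, and the 1.73 factor is √3:

```python
import math

# Single-phase input de-rating sketch (example numbers from the post above).
nameplate_current_a = 23.0          # 7.5 HP motor full-load amps
single_phase_factor = math.sqrt(3)  # ~1.73: same power carried by two input legs instead of three

# Input current the VFD's rectifier section must handle when fed single phase
required_input_current_a = nameplate_current_a * single_phase_factor
print(f"Required single-phase input current: {required_input_current_a:.1f} A")  # ~39.8 A
```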
 
No real motor-side performance issues; perhaps a little extra heating at higher frequencies.

The VFD-side issues are basically heating also, but more heating in approximate proportion to the increase in frequency, since each "switching event" has a similar energy loss (at any given current).

Use the lowest frequency you can tolerate. That will provide the least heating.
 
I had a "Decel Inhibit" fault when PWM frequency was 4 KHz. Upped it to 8 Khz. No problem yet but I'm not sure yet if that did anything.
- Decel Inhibit : Drive is not following a commanded acceleration because it is attempting to limit bus voltage.

I was also thinking of changing the parameter from sensor-less vector to a fan/pump V/Hz.

What about connecting a sizeable oil filled capacitor between L1 and L3 (The unused input). Or between L2 and L3?

Uncoupling the motor from the load cannot be done easily in my case. I would consider a heavy fly-wheel to be part of the load and not part of the motor.
I did the rotational test with the fly-wheel installed and it probably makes some kind of difference.

I was also wondering if the rotational sensing test has to be performed each time a PWM frequency change or PWM mode change is made.
 
Using a capacitor would allow a current injection into the charging section during an interval that would otherwise not be active. A ninety degree phase shift isn't ideal, but it's better than nothing.

A jumper shares the current but I don't think you gain any advantage.
 
Limiting DC bus voltage means you have inadequate energy storage or dissipation in the DC bus for the braking you're trying to do. Normal VFD deceleration regenerates energy from the motor back into the DC link, but generally cannot push it back into the AC grid.

There are a few options:

  • Decelerate slower, or allow the motor to coast to a stop.
  • Use a braking resistor to dump the energy into.
  • Fit more capacitors to the DC bus, so that there is more energy storage.
  • In some very specific circumstances, manufacturers will let you common up the DC bus of multiple drives, allowing them to share regenerated energy.
  • Use DC injection braking, which dumps the heat into the motor windings. It's less controllable though.

This behaviour isn't really related to using single phase; if anything using an oversized VFD means you have a bigger DC bus to dump the energy into.
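To put rough numbers on why deceleration hits the bus limit, here is a hypothetical sketch; the flywheel inertia, bus capacitance, and voltage figures are made-up illustration values, not taken from this drive or this thread. It compares the kinetic energy that has to go somewhere during a decel with the small amount of extra energy the bus capacitors can absorb:

```python
import math

# Hypothetical illustration values -- not from the thread.
inertia_kg_m2 = 0.5          # flywheel + load inertia (assumed)
speed_rpm = 1800             # running speed
bus_capacitance_f = 2000e-6  # DC bus capacitance (assumed)
v_bus_nominal = 325.0        # ~230 Vrms line, rectified peak
v_bus_limit = 400.0          # assumed bus voltage limit the drive is protecting

# Kinetic energy stored in the rotating load
omega = speed_rpm * 2 * math.pi / 60
kinetic_j = 0.5 * inertia_kg_m2 * omega**2

# Extra energy the bus caps can absorb before reaching the limit
cap_headroom_j = 0.5 * bus_capacitance_f * (v_bus_limit**2 - v_bus_nominal**2)

print(f"Kinetic energy to dissipate: {kinetic_j:.0f} J")   # thousands of joules
print(f"Bus capacitor headroom:      {cap_headroom_j:.0f} J")  # tens of joules
# The kinetic energy typically dwarfs the capacitor headroom, so the drive
# either stretches the decel (Decel Inhibit) or needs a braking resistor.
```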
 
What about connecting a sizeable oil filled capacitor between L1 and L3 (The unused input). Or between L2 and L3?
....

I suspect if this was an effective fix, the manufacturer would have mentioned it in the manual, or better yet included it in the design.
 
The phase shift is load-dependent. A rectifier is a very non-linear load.

Bottom line is that you almost certainly do better with a plain jumper.
 
Incoming on L1 & L2 230 Vac. A jumper from L2 to L3 will strap their rectifiers in parallel. The charging will be the same.

A phase-shifted charge pulse would be desirable in a single phase arrangement where the charge pulses occur 180 degrees apart.
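A quick way to see that 180 degree spacing in numbers, assuming a 60 Hz supply (a generic sketch, not anything specific to this drive):

```python
# Charge-pulse spacing into the DC bus, assuming a 60 Hz line.
line_hz = 60
cycle_ms = 1000 / line_hz  # ~16.7 ms per cycle

# Single-phase full-wave bridge: one charge pulse every half cycle (180 degrees)
single_phase_pulse_ms = cycle_ms / 2   # ~8.3 ms apart, 120 Hz ripple

# Three-phase six-pulse bridge: one charge pulse every 60 degrees
three_phase_pulse_ms = cycle_ms / 6    # ~2.8 ms apart, 360 Hz ripple

print(f"Single phase: pulses every {single_phase_pulse_ms:.1f} ms")
print(f"Three phase:  pulses every {three_phase_pulse_ms:.1f} ms")
```

The long gap between single-phase charge pulses is why the bus capacitors and input diodes work so much harder on single-phase input.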
 
Desirable, not necessarily achievable without as much trouble as getting three phase to begin with.

The highly variable impedance makes the phase shift pretty inconsistent.
 
Incoming on L1 & L2 230 Vac. A jumper from L2 to L3 will strap their rectifiers in parallel. The charging will be the same.

A phase-shifted charge pulse would be desirable in a single phase arrangement where the charge pulses occur 180 degrees apart.

Patent the idea and sell it to VFD manufacturers.
 
Already done... they bought it up and suppressed it, just like those 100 mpg carburetors.
 
The test would be to up-size the load and catch a fault.

Then see if the fault goes away with a simple jumper tie, or with a hefty cap tie. The caps would fit into another Teco case, so they would not lose any money, and they can steal my idea right now.
No patents, no attorneys, just a bunch of ChoComms who keep putting us in debt.
 
L2 and L3 tied together just means L1 diodes fail first.

A capacitor from L2 to L3 likewise does nothing.

An auto transformer in series with a capacitor (to push volts through it) would help.
But it's an expensive solution compared to just using larger rectifier diodes and capacitors.
 
L2 and L3 tied together just means L1 diodes fail first.

A capacitor from L2 to L3 likewise does nothing.

An auto transformer in series with a capacitor (to push volts through it) would help.
But it's an expensive solution compared to just using larger rectifier diodes and capacitors.

I measured an increase in current with a 50 µF cap from L2 to L3. The reading doubled in the under-2 A range.
Now if I could use a delay line long enough for an 8 ms delay, that would be about right.
 
It will put in more current, as it ends up acting as a voltage doubler.

That may not be good though. You could easily end up going over the maximum bus voltage at low loads.

Overall, no real advantage, and a definite disadvantage, as the bus voltage goes up over the peak line voltage.
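Rough numbers on why the doubler effect can push the bus too high at light load (a sketch; the trip threshold shown is an assumed typical value for a 230 V class drive, not taken from this drive's manual):

```python
import math

# Peak DC bus voltage from a 230 Vrms line
v_line_rms = 230.0
v_peak = v_line_rms * math.sqrt(2)   # ~325 V: normal rectified bus at no load

# With the capacitor acting as a voltage doubler at light load, the bus can
# drift toward roughly twice the peak line voltage.
v_doubled = 2 * v_peak               # ~650 V worst case

v_overvoltage_trip = 400.0  # assumed typical overvoltage trip for a 230 V class drive

print(f"Normal no-load bus:     {v_peak:.0f} V")
print(f"Doubler worst case:     {v_doubled:.0f} V")
print(f"Assumed trip threshold: {v_overvoltage_trip:.0f} V")
```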
 
...
If the VFD switching frequency range is 2, 3, 4, 5, 6, 7, 8, 9, 10 kHz, what would be an optimum number, in the sense of improving any performance factor?
The factory default is 4 kHz.
Switching frequency (aka Carrier Frequency) has little to no effect on the line side conversion aspect. What it affects are the switching losses in the output transistors; more On-Off transitions per ms, more losses. So generally the VFD mfr will provide you with a chart on de-rating the drive as the CF increases. As a gross generalization though, the de-rate is generally 4% for every 1 kHz increase in the CF above whatever the drive has as a default setting (the basis of the drive rating). So for example if your drive rating is based on 4 kHz and you turn the CF up to 10 kHz, you would de-rate the drive by 24% ((10-4) x 4).

However if you have ALREADY de-rated the drive because of the single phase input, you usually do NOT need to further de-rate it, because the output de-rating is already covered in the input de-rating.
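Here is that 4%-per-kHz rule of thumb as a small sketch. The 4% figure and the 4 kHz baseline are just the gross generalization quoted above; always check the manufacturer's actual de-rating chart for your drive:

```python
def carrier_freq_derate(rated_current_a, default_khz=4.0, new_khz=4.0, pct_per_khz=0.04):
    """Apply the rough 4%-per-kHz carrier frequency de-rating rule quoted above."""
    extra_khz = max(0.0, new_khz - default_khz)
    derate_fraction = extra_khz * pct_per_khz
    return rated_current_a * (1.0 - derate_fraction)

# Example from the post: 4 kHz baseline turned up to 10 kHz -> 24% de-rate.
# A hypothetical 40 A drive rating would drop to 30.4 A.
print(carrier_freq_derate(40.0, default_khz=4.0, new_khz=10.0))
```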
 








 