
Is compensating for tool wear a "thing"?

Conrad Hoffman

Canandaigua, NY, USA · joined May 10, 2009
Not a CNC guy here, so maybe a dumb question. Does one compensate for tool wear? I don't really grasp this because it seems if a tool is worn enough to need compensation, it's what I'd call dull, and in need of replacement. Seems like the finish would be going bad. How does it work in the "real" world?
 
It depends on your tolerances. At +/-.005" you don't need to, but much tighter than that, yes. For me, always! Tools will still leave a nice finish but need .0015"-.002" of comp with 3/8" or smaller tools, and will consistently hold a tenth or two on part size.
 
Definitely a thing for sure, we use it daily and have used it for a good 30 years.
You can make small adjustments that can either account for tool pressure, or the tool just not being as sharp as it was when it was brand new.
 

I also use it when I have a tool that's been sharpened and is now undersize, running in an existing program I don't feel like going into the CAM to change.
 
Yes, it is a normal part of running a CNC machine--especially a CNC lathe. The tool wears, you comp the offset. As the machine warms up, you will need to adjust the offsets to account for machine thermal growth.
 
It's also important for using indexable cutters on lathes. You run a series of 0.375 parts. Maybe you start at 0.374 with a fresh insert and at 0.375 or 0.376 you start taking out wear comps. When the tool is finally too dull, you rotate the insert to a fresh edge and either zero the wear offset or even add some positive offset for the first cut to see where that edge actually is. You keep those changes in the wear column so you never lose track of where the tool is supposed to be nominally.
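As a sketch, the bookkeeping described above looks roughly like this (the sign convention and numbers are assumed for illustration, not taken from any particular control):

```python
# Illustrative sketch of lathe wear-offset bookkeeping as described
# above. Sign convention assumed: a positive wear value leaves the
# part bigger, so an oversize measurement means taking comp out.

NOMINAL_OD = 0.3750  # target part diameter, inches

def updated_wear(measured_od, current_wear):
    """Subtract the measured size error from the current wear offset."""
    return current_wear - (measured_od - NOMINAL_OD)

# Fresh edge cuts a hair small; as the edge wears, parts drift big,
# and comp is adjusted each time a measurement shows drift.
wear = 0.0
for measured in (0.3740, 0.3752, 0.3756):
    wear = updated_wear(measured, wear)
```

When the insert is indexed to a fresh corner, the wear value is zeroed (or set slightly positive for a test cut), which is why keeping the adjustments in the wear column rather than the geometry offset preserves the tool's nominal position.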
 
Absolutely a thing. We used to get these 1" TiN-coated cobalt endmills from overseas; that's all we knew about them, that they came from overseas. They were as tough as a two-dollar steak and would just cut and cut, but the size was all over the place: might be 1.000", might be 1.005" or .995". You just never knew, so we had to mic and comp them and then let 'em eat.
 
Most endmills I get are about a thou undersized, so I comp that out. Comp also accounts for cutter and part deflection, as previously mentioned, which will change as the cutter dulls. Since I try to keep my profiles within .0005", even when tolerances are higher (it makes subsequent setups easier and more accurate), I comp everything that makes a finish pass.
 
I could be wrong but I feel like it has more to do with deflection from increased cutting pressure rather than the cutting edge being that much further away from the surface of your part. Like you may have to offset a 3/8” end mill .002” to compensate for only .0005” of measurable wear.
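For a rough sanity check of that deflection point, an endmill can be modeled as a round cantilever. The stickout, cutting force, and modulus below are assumptions for illustration, not measurements:

```python
import math

# Rough cantilever-beam sketch of the deflection argument above.
# Assumed numbers: solid carbide (E ~ 87e6 psi), 1.5" stickout,
# 40 lbf cutting load.

def endmill_deflection(force_lbf, stickout_in, dia_in, e_psi=87e6):
    """Tip deflection of a round cantilever: delta = F*L^3 / (3*E*I),
    with second moment of area I = pi*d^4/64."""
    i_area = math.pi * dia_in**4 / 64.0
    return force_lbf * stickout_in**3 / (3.0 * e_psi * i_area)

# A 3/8" carbide endmill at 1.5" stickout under 40 lbf deflects on
# the order of half a thou, so rising cutting pressure from a dulling
# edge can plausibly dominate the .002" of comp mentioned above.
delta = endmill_deflection(40.0, 1.5, 0.375)
```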
 
Turning brass bushes right now for the sights I make. Tolerance is +0 -.01mm so I check and adjust comp on the lathe periodically as it runs unattended while I am doing other work.
 
Many, many years ago, one of the manual guys thought he'd like to help me out on the CNC lathe I ran. He was a third-shifter, and all he could do was run whatever part I had set up. He couldn't grasp how to adjust the tool offset, so he'd change to a new insert whenever the gage got tight or the OD ran heavy. After a few weeks, he decided to go back to a manual lathe.

I mostly used a top notch NTP4 insert for threading and could usually adjust the offset about .007" total over many parts before the insert was worn enough for replacement.
 
Was going to say the same thing. A 3/8" endmill starts as an on-size 3/8" piece of carbide, and after grinding, almost any endmill is going to be a little bit undersized. For instance, Helical states their endmills are +.000"/-.002".
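The arithmetic behind comping an undersized cutter is just the diameter error split per side; the helper name below is made up for illustration:

```python
# Hypothetical helper for comping an undersized endmill, per the posts
# above: a diameter error on the tool splits in half on each wall.

def per_side_comp(nominal_dia, actual_dia):
    """Wear-offset adjustment per side for an off-size cutter.
    Negative means the tool cuts shy and the path must move
    toward the wall."""
    return (actual_dia - nominal_dia) / 2.0

# A 3/8" endmill grinding .001" undersize leaves each wall .0005" heavy,
# so a -.0005" per-side comp (-.001" on a diameter-style offset) fixes it.
adjust = per_side_comp(0.3750, 0.3740)
```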
 
We use a lot of reground and some custom tooling (OE production manufacturing), so everything is comped. Not a lot of super-tight tolerance work, so it's not always critical, but making it standard practice helps keep us from overlooking something.
 
Tool compensation is more than an easy adjustment for tool wear. Think in terms of the part size needed, too. If I need a slightly looser press fit than what I'm getting, I can make a .0001" adjustment very easily. This is much faster than changing the program or the tool.
I will also add that I write my programs to the part size minus the cutter radius (milling) so that my tool offset is set at zero. In Mastercam, it is referred to as WEAR COMP. This results in very minimal lead-in/out distances. When a cut is in a very small space for leading in/out, you will need that since many machines will not allow a vertical move and maintain compensation at the same time. For instance, "wear comp" will allow a lead-in of .001, whereas "cutter comp" will require a lead-in of at least the radius of the cutter. The program may check out fine, but the machine may give an error alarm.
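As a toy check of that lead-in constraint (the radius requirement is assumed behavior typical of such controls, not any specific machine):

```python
# Sketch of the lead-in rule described above. Assumed behavior: with
# full cutter comp the control needs a lead-in move at least as long
# as the cutter radius to establish the offset; with the path already
# programmed at part-minus-radius ("wear comp"), only the small wear
# value is active and a tiny lead-in suffices.

def lead_in_ok(lead_in, cutter_radius, full_cutter_comp):
    """True if the lead-in move can establish compensation."""
    required = cutter_radius if full_cutter_comp else 0.0
    return lead_in > required

# .001" lead-in with a 3/8" endmill (radius .1875"):
lead_in_ok(0.001, 0.1875, full_cutter_comp=True)   # False: alarms out
lead_in_ok(0.001, 0.1875, full_cutter_comp=False)  # True: wear comp fits
```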
 
I will also add that I write my programs to the part size minus the cutter radius (milling) so that my tool offset is set at zero. In Mastercam, it is referred to as WEAR COMP. This results in very minimal lead-in/out distances. .. For instance, "wear comp" will allow a lead-in of .001, whereas "cutter comp" will require a lead-in of at least the radius of the cutter. The program may check out fine, but the machine may give an error alarm.
You've been speaking too much Japanese. This is all historical horsecrap.

There is CL programming (some people say centerline, some say cutter line) and there was part-surface programming. And there is no such thing as "wear comp" versus "cutter comp"; it's all cutter comp.

Typical Japanese excessive, pointless complication that serves no purpose except to make things more confusing and inefficient. And weird, too. It's effing dumb.

I sometimes wish that Fanuc had been born dead, but on the other hand, we in the US fucked ourselves and deserve what we get, advertising and selling each other life insurance as the country's major products, yippee. So most likely this crap is the best we'll ever have :(
 
I think tolerable tool wear varies. A sharp tool might improve surface finish with a .003" wear land, so a .003" wear adjustment may be favorable to get the most tool usage. A cutter for a cast-iron part may allow a .010" loss of size, and a .030" wear land or more might be best for extending tool life.
The same with a grinding wheel: often a wheel gets better with a little wear.
I'm not a CNC guy, but in some cases taking an extra .001"/.002" after 100 or 1,000 parts seems understandable.
 








 