There has been a trend among large companies these last few years to use buzz phrases so often that they have become common vernacular in the workplace. These are words that I believe dilute the foundation of corporate hierarchy.

I have noticed a push to move away from the word management and replace it with leadership. In my opinion this is the most offensive shift. Managers are appointed or hired. Leaders rise to the occasion regardless of their job title. Conferring the role of leader on a person by simple promotion dilutes the word to nothing.

Employees are now called team members or associates. This is meant to instill a sense of ownership and participation among employees when in fact none exists. Most people I know who work at these big-box companies find the whole exercise demeaning. They would rather have more autonomy in their career path, which would make them more productive and give them better job satisfaction.

In my industry I’ve seen phrases such as best face forward, best practices and client focused used repeatedly. At face value these sound like sound business principles, but who defines them? Managers? Leaders? I have witnessed people with absolutely no sales experience write the manual for best sales practices without input from active producers in the field.

Words like diversity are given the glossiest advertising treatment and are largely ignored at the local level. Harassment-free workplace is posted on every bulletin board and elevator, but individual complaints are ignored or covered up.

So here is my question: who are we impressing with these campaigns? Let managers manage and leaders lead. Let producers sell and service personnel serve. Get rid of any layer of management that amounts to business prevention and replace it with outcome-based individuals. Oops, couldn’t help myself.

I have to say if I never see the word synergy again, I will be a happy man.