|61||The Evolution of Compensation|
“Until the early 1980s, compensation wasn’t linked in any way to performance. It was a paternalistic process that had little to do with business strategy,” says Michael Thompson, national director of reward consulting for the Hay Group. Base salaries were standard, few workers received incentives for performance, and pay was based largely on the economy and seniority.
That began to change with the high inflation rates of the 1980s, Thompson says. “Organizations recognized that competitive advantage was not just dictated by access to capital or technology, but by people and talent.”
In the last two decades, compensation transitioned from an administrative task to a complex balancing act between driving competitive advantage and managing workers’ needs. Individual performance plays a larger role in determining compensation, and performance reviews are now a key component of benchmarking the value of workers.
But individual performance isn’t the only factor that determines worth in today’s workplace, Thompson says. To get a true measure of value, managers also evaluate the impact of a worker’s performance on the overall success of the company. The kind of performance that matters the most determines which jobs receive the greatest compensation. “That’s what defines the value of a worker,” he says.
At one time, HR grappled with privacy issues in the form of locked desks and locker searches. Now, privacy issues are high-tech. Given the rising threat of terrorism, the theft of intellectual property, and workplace fraud, many organizations are desperately searching for ways to combat criminal risks. With growing frequency, they’re turning to sophisticated surveillance techniques that allow the employer to monitor workers via video, computer keystrokes, Web page visits, and physical movements within a building.
Despite an outcry from privacy advocates, courts have consistently upheld the right of employers to monitor workers.
For HR, maintaining security while avoiding an Orwellian workplace is no simple task. “Personal information collected legally through e-mail and surveillance allows a boss or someone else to further their agenda,” observes Simson Garfinkel, author of Database Nation: The Death of Privacy in the 21st Century.
Personal information collected in the workplace, such as Social Security numbers and home phone numbers, is finding its way into the outside world. Conversely, information from the outside, such as medical records and genetic data, is filtering into the workplace, creating new privacy risks. Organizations are searching for new ways of dealing with legitimate security risks while protecting private information.
For everything that HR wants to know, there’s a test: Personality. Honesty. Interests. Skills.
During the industrial revolution, testing was more about fitting people into processes already established. Now, tests are used to figure out things like how a person can contribute to a business, and whether a person fits into a company’s culture.
“Employers are now using tests to develop individual plans for maximizing employee satisfaction, increasing retention, and supporting the organization’s strategic objectives,” says Charlie Wonderlic, whose grandfather, Al Wonderlic, was testing for cognitive ability in 1937.
Résumés and interviews weed out clearly unqualified job candidates, but tests are thought to provide much more information. Many tests, however, weren’t developed to measure job performance in the first place. “There has been a proliferation of bad tests,” says Dr. Wendell Williams, an industrial psychologist with ScientificSelection.com, which develops selection tools.
Williams says research shows that the best way to predict an employee’s potential is by measuring intelligence, particularly as it relates to solving problems similar to those that could occur in a business. This can be done, of course, with a test.
Abraham Maslow did most of his important research in humanistic psychology in the 1950s, while chair of the psychology department at Brandeis University. It was there that he created his “hierarchy of needs,” determining that low-level needs must be satisfied before higher-level needs can be met. The hierarchy comprises five levels, in this order: physiological (the basics, such as air, water, and food), safety, love and belonging, esteem, and self-actualization.
The last level in his hierarchy, self-actualization, pertains to how a person fulfills her potential. It was a sign of the times; his work came into vogue during the 1960s, when people were looking for more meaning and purpose in their lives.
Not everyone buys the hierarchy as a workplace motivator. John Boudreau, a professor of HR at Cornell University, cites the starving-artist syndrome, in which “a person would rather pursue their art than eat.”
Virtually everything that human resources managers do is done differently in a global environment. “HR has to take into account currency differentials and cultural differences that affect how you pay and the way people are paid, according to their class,” says John Boudreau, a professor of HR at Cornell University.
Until 1970, globalization meant direct foreign investment going from the United States to other parts of the world, and American employees being sent to work overseas. After the 1980s, the concept changed to include foreign investment coming into the United States from Europe and Japan, and foreign companies building facilities on American soil.
The result is that many large multinational companies find that their organizational setup works well for operations in the United States, but not in other countries. “Many HR functions, like recruitment, selection, and training, are also optimized for U.S. operation,” says Peter Dowling, a fellow at the center and co-author of International Human Resource Management. “And HR finds it difficult to change because its experience base is domestic.” Whether the task is finding managers who can staff international offices, or training employees to work in other countries, change is slow. Typically, major changes don’t occur in HR until 60 percent of a company’s revenue is foreign.
|66||IBM and the Birth of Corporate Culture|
In the 1930s and 1940s, few companies paid serious attention to corporate culture. The exception was Thomas Watson, founder of IBM.
According to his son, Thomas Watson Jr., author of the best-seller Father, Son & Co. (Bantam Books, 1990), employees at IBM in the 1930s earned well-above-average salaries, worked in clean shops, attended free company-sponsored concerts, and were invited to night courses to learn how to get promoted.
Early IBMers also adhered to a strict dress code while working alongside the now famous “Think” sign. Watson’s message? That employees would advance faster if they used their heads.
The results of Watson’s culture-building were impressive: IBM dominated the market in the ’30s and ’40s, successfully expanded its monopoly overseas, and managed to avoid unionization.
|67||Higher Education for All|
Following World War II, the GI Bill greatly expanded the concept of higher education and the nature of work. Although a pension plan had been established for veterans after the Civil War, it was largely gutted after World War I. The broken promise ignited a march on Washington during the Depression by angry vets demanding recompense. They were attacked by federal troops. The debacle was so traumatic for the country that Congress vowed to treat veterans more generously. Thus was born the GI Bill, legislation making it possible for millions to pursue higher education. About 7.8 million World War II veterans received benefits, and 2.2 million of those used the bill for higher education. By 1947, half of all college students were veterans.
Colleges, in turn, received years of financial security. Grants became more prevalent, and student bodies exploded. “Practical” degree programs in fields such as business were established. Veterans of all backgrounds, ages, and religions poured into community and state colleges, changing the complexion of higher education and, consequently, the nature and needs of the workplace.
In 1925, labor leader A. Philip Randolph organized the Brotherhood of Sleeping Car Porters, the first black union in American history. “The group fought the Pullman Company for 12 years, but they finally won recognition,” says Norm Hill, president of the A. Philip Randolph Institute. As a result of the organization’s success, Randolph became a visible spokesperson for African-American rights in the 1940s and 1950s. He focused on making sure that blacks weren’t discriminated against in government jobs. He also influenced the formation of the Fair Employment Practices Committee and was instrumental in the enactment of an executive order barring discrimination in the military.
In 1963, he led a march on Washington, D.C., for jobs and freedom, an event that rallied 250,000 people. Following the peaceful demonstration, Randolph, Martin Luther King Jr., and other black leaders met with President Kennedy, and within a year, the Civil Rights Act of 1964 was enacted.
“In many ways he was the father of the modern civil rights movement,” Hill says. “And as the workforce is increasingly populated by minorities and women, Randolph’s work with the brotherhood should serve as a model for the low-wage worker.”
Employers haven’t always invested time, effort, and money in helping people they’ve laid off. Three decades ago, it was a novel idea to put aside resources for job counseling and placement to assist employees who had lost their jobs.
It was a way of thinking about employees with more compassion, and one that began to evolve at a time of increasing layoffs, which were associated with waves of mergers and acquisitions, says John Challenger, CEO of the outplacement firm Challenger, Gray & Christmas, Inc., in Chicago. Many people were losing their jobs through no fault of their own, and all parties involved had a vested interest in creating a system that helped deal with it. HR managers felt responsible to employees, and companies wanted to avoid litigation related to layoffs.
Recently, outplacement packages have gotten less generous and have been less effective in helping employees find new work, says Kate Wendleton, CEO of the Five O’Clock Club, a career counseling network based in New York. The firm’s COO, Richard Bayer, says employers today use outplacement services to help maintain the morale and productivity of the existing workforce.
In the 1930s, there was a surplus of labor and a shortage of financial capital. In an effort to better manage this limited resource, companies developed accounting and measurement systems that closely tracked their financial progress.
But over the years, the market has come to recognize that there is more to the valuation of companies than what is reflected in traditional accounting systems. Buyers and analysts are, increasingly, valuing companies at levels much higher than what is seen on the balance sheet. “We are entering a new era in which intangibles such as intellectual capital matter to stock price,” says HR professor John Boudreau of Cornell University.
The growing importance of intangible assets such as employee knowledge and skill sets has changed the role of HR. The good news is that HR activities have risen in stature. The bad news is that HR is being charged with the daunting task of determining how to measure and manage something so intangible.
“But the real challenge is not one of measurement,” Boudreau says. “It’s that we don’t have a logical point of view about how talent drives organizational success.” It will be up to HR professionals to create this new system of accountability.
|71||Women in the Workplace|
Of all the pivotal events affecting human resources in the 20th century, none had a more dramatic impact than the legions of American women who entered the job market. In the 1920s, about 20 percent of the nation’s women held jobs; by the end of the century, the number had tripled to 60 percent. (In 1995, 8 out of 10 women between the ages of 20 and 44 were employed.)
Born of the industrial revolution of the 19th century, the women’s movement of the 1960s was a major catalyst for political, social, and educational equality. From the beginning, feminists fought for issues directly affecting HR-ranging from access to employment, education, child care, contraception, and abortion, to equality in the workplace, changing family roles, the need for equal political representation, and redress for sexual harassment in the workplace.
In the second half of the 20th century, several key events thrust American women into a world of unimagined economic independence. The Food and Drug Administration approved the use of birth control pills (1960). The Equal Pay Act (1963) made it illegal for employers to pay a woman less than a man for the same job. President Lyndon Johnson’s expansion of affirmative-action policies ensured that women and minorities would have the same employment opportunities as white men (1967). Title IX of the Education Amendments led to a large increase in the number of women in athletic programs and professional schools (1972). The Pregnancy Discrimination Act (1978) ensured that a pregnant woman can’t be fired or denied a job or promotion because she is or might become pregnant.
Despite extraordinary gains, the National Partnership for Women and Families reports that most poor families are headed by women, and that women are still clustered in low-paying, traditionally female occupations.
Employee assistance programs, now an expected benefit at most large companies, evolved out of alcoholism intervention in the workplace by fellow employees, unions, and/or employers in the mid-20th century, says Margaret Altmix, president of the Chicago-based accrediting group Employee Assistance Society of North America. They also were a natural outgrowth of the occupational health movement in the 1960s.
Employers began to recognize that their workers had the same problems that were reflected in the larger community, says Gregory P. DeLapp, immediate past president of the Employee Assistance Professionals Association Inc. in Arlington, Virginia. Companies were losing employees, and with them the resources that had been invested in them, because of illnesses and personal problems that were treatable with counseling.
Eventually, EAPs evolved to encompass many issues that affect the well-being of employees and their ability to perform at work — ranging from divorce to post-traumatic stress syndrome. What began as an employee-recovery social movement has evolved into part of the basic fabric of today’s workplace, DeLapp says. The catastrophic events of September 11, and the acute needs of the people affected, demonstrate the value of EAPs, he adds.
|73||The National Labor Relations Act|
The HR community has been directly affected by the two dominant statutes governing labor-management relations: the National Labor Relations Act and the Labor Management Relations Act. These statutes created the National Labor Relations Board, a federal agency that has two roles. The board receives, investigates, and resolves unfair labor practice complaints against unions and employers. Unfair labor practices include employer or union interference with employee rights to engage or refrain from engaging in concerted activities; employer or union refusals to bargain in good faith; and discrimination in employment. Certain union picketing activities, such as secondary boycotts, are also banned.
The agency also oversees representation elections. There, employees and unions can request that the board conduct a secret-ballot election to vote on whether employees wish to have union representation.
|74||A Secure Old Age|
Despite the reality that Generation Xers tend to be cynical about Social Security, it continues to be one of the most popular U.S. government programs ever created. It began as a way of caring for the elderly, and as a way of opening up jobs for younger workers.
Over the years, there have been attempts at supplementing Social Security with employer pension plans, in which a company would invest money for an employee based on tenure. The popularity of such plans, though they still are used in many companies, is starting to fade as jobs have become more portable.
More recently, plans such as 401(k)s have helped employees with retirement. Many people, however, don’t fully understand the concept, and don’t participate, or cash in too early to fully benefit.
Steve Sass, author of the book The Promise of Private Pensions (Harvard University Press, 1997), says Social Security has made a huge difference in workers’ lives. “Old people are no longer poor, and once they were.”
|75||The Americans with Disabilities Act|
For the past eight decades, HR professionals have been expected to be keen students of changing laws, savvy interpreters of new employment and discrimination legislation, and authoritative spokesmen for the rights of both employer and employee.
The Americans with Disabilities Act of 1990, for example, was a child of an earlier federal law, the Rehabilitation Act of 1973, which had been limited to federal contractors and entities that were receiving federal financial assistance. The ADA prohibits employment discrimination against qualified individuals with disabilities in the private sector, and in state and local governments. The EEOC was given enforcement power over the federal act.
The next year brought the Civil Rights Act of 1991, a key piece of legislation with direct impact on the workplace. Enacted after Congress voted to overturn a series of conservative Supreme Court decisions related to workplace discrimination, it provides, among other things, monetary damages in cases of intentional employment discrimination. Congress went further, providing that, as with the Age Discrimination in Employment Act, jury trials would be available in ADA cases. For the first time, compensatory and punitive damages could be awarded.
Frederick Winslow Taylor’s theory of scientific management made him extremely unpopular with workers in the 1890s, says Robert Kanigel, MIT professor of science writing and author of The One Best Way: Frederick Winslow Taylor and the Enigma of Efficiency. Taylor’s theory says that production efficiency can be greatly enhanced by closely watching individual workers in order to find and eliminate the wasted time and motion in the operation.
With precise observation, management could identify the “one best way” to do a job, determine the correct productivity level, and set a pay rate based on that level, Kanigel says. Those who did not make that level would earn less money.
Though the system provoked resentment, Taylor’s influence on improving cost-effectiveness in mass production can’t be dismissed, Kanigel says. “We live in a world where common daily products are dirt cheap, and part of that derives from efficient production. When we condemn his excesses, it’s important to remember we owe him some of our material prosperity.”
Still, Kanigel says, Taylor’s theory is bad management practice: “People don’t like to be told in elaborate detail how to do their jobs.”
|77||Title VII of the Civil Rights Act|
This sweeping legislation created job protections and opportunities that have served as the foundation of HR employment practices for nearly four decades. The Civil Rights Act of 1964 prohibited employment discrimination based on race, sex, color, national origin, and religion. The law created an administrative charge-processing system that gave a complaining employee or applicant the right to file a suit in federal court.
At first, most of the significant litigation involved class actions to dismantle race-based seniority systems and restrictions against blacks in blue-collar industries. Then, in 1972, Title VII was amended to give the Equal Employment Opportunity Commission its own enforcement authority. This meant that the EEOC, along with its investigation and conciliation of charges of discrimination, could file its own lawsuits.
Along with its increased enforcement authority came bureaucratic headaches for the EEOC. The agency became mired in a growing backlog of discrimination charges that never seemed to be investigated. Over the next 20 years, the EEOC increased its professionalism and reduced its backlog.
It started 40 years ago in an Illinois basement, with about 10 students in a management-training class. Today, 200 pupils attend classes at Hamburger University, and more than 65,000 managers are graduates.
“McDonald’s was really on the leading edge of what was taking place 40 years ago,” says Pat Burke, a vice president for Drake Beam Morin and a training expert. “Then, a number of other organizations started paying attention.”
The global giant was one of the first to consider different ways of educating employees. Over the years, its menu has changed, and so has its training. Courses are now available in 22 languages. To accommodate a growing range of cultures and languages, the corporation is using more animation and graphics and less text in its training materials. McDonald’s has 10 training centers worldwide, including facilities in England, Japan, Germany, and Australia.
There’s now a digital component to the university: e-training that students can access on the Web.
The history of HR is laced with management initiatives that promised to solve some vexing business issue. In the 1970s, transactional analysis was the rage. In the 1980s, the one-minute manager and quality circles held sway. And in the 1990s, TQM, self-directed teams, re-engineering, empowerment, and emotional intelligence all had their 15 minutes of fame.
Although these efforts are well-intentioned, and many have successfully transformed companies, the endless cycle of management programs has created legions of workplace skeptics who regard the latest directive from management as nothing but a flavor of the month.
“A lot of so-called HR fads have been influential,” says HR professor John Boudreau of Cornell University. But problems arise when HR professionals begin to chase fads simply because other companies do. “If HR continues to look externally for answers, it will continue to be regarded as a profession that can’t think independently and account for its impact.”
Sure, it’s hard work not to follow the herd, especially when conferences, magazines, and HR leaders join together in support of a particular profit-building initiative. But for HR to become strategic, that’s exactly what must happen.
|80||The Age Discrimination in Employment Act|
The law was enacted in 1967 to protect employees 40 and older. Very little litigation developed under the ADEA until the late 1970s, when corporations started restructuring and reducing their over-40 workforces. Plaintiff lawyers also discovered that age-discrimination lawsuits had some punch because jury trials were available, and juries were sympathetic toward older workers who were fired. Gradually, the upper age limitation on the ADEA was shifted from 65 to 70, and then entirely removed. Enforcement authority transferred from the Department of Labor to the EEOC.
Employers, sensitive to the risks of age-discrimination lawsuits and the threat of whopping jury verdicts, developed counter tactics. For a little extra severance pay, terminated employees were asked to sign general releases agreeing not to sue. Congress then adopted legislation regulating the “when” and “how” of releases.
Workforce, January 2002, pp. 48-56