Hello all,
I have a spreadsheet with two worksheets. The first is a list of employee data, one employee per row, with their city and salary. There is a blank column to the right for the geographic differential (cities can be more or less expensive to live in). The cities are standardized across the company, so they can work as a lookup value.
The second is a list of cities and the geographic salary multipliers in each. The multipliers vary by salary bracket. In Atlanta, for instance, people making around $20k see pay increased by 4.3%, and people making $90k see a lower differential of 1.0%. I should note that differentials can be negative as well; in the attached spreadsheet, those two differentials are actually 4.6% and -4.0%, respectively.
What I want to do is find the salary bracket each employee's salary is closest to, using a standard mathematical round, and return the percentage by which the salary should be increased or decreased. I know I need arrays, INDEX, and MATCH, but this is the most complicated formula I have ever built and it's making my head spin. Any help is, as always, much appreciated!
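To make the lookup logic concrete, here is the behavior I'm after, sketched in Python rather than Excel. The city names, bracket values, and differentials below are placeholders, not the real data from the attached spreadsheet:

```python
# Hypothetical stand-in for the second worksheet:
# city -> {salary bracket -> geographic differential}.
# Values here are illustrative only.
differentials = {
    "Atlanta": {20000: 0.046, 90000: -0.040},
}

def geo_differential(city, salary):
    """Return the differential for the bracket nearest to salary.

    Uses a standard mathematical round: a salary exactly halfway
    between two brackets goes to the higher bracket.
    """
    brackets = differentials[city]
    # Pick the bracket with the smallest absolute distance;
    # on a tie, -b makes the larger bracket win.
    nearest = min(brackets, key=lambda b: (abs(b - salary), -b))
    return brackets[nearest]
```

So an Atlanta employee making $25k would fall into the $20k bracket, and one making $80k would fall into the $90k bracket. The Excel equivalent would look the city up on the second sheet and then match the salary against the bracket row.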
Excel Help.xlsx