How to show % increase when accuracy decreases?

OK, I'm stuck on this one and hope you can help.

Here is the sample data:

Date                        1      2      3
Made                        1167   1318   1225
Passed                      1121   1203   1209
Passed - As a % of Made     96%    91%    99%
% Increase in Accuracy      0%     X%     Y%

1. The 'Date' row will continue for an indeterminate amount of time.
2. The 'Made' figures are the totals produced for each 'Date' and may increase
or decrease from the prior 'Date'.
3. The 'Passed' figures are derived from how many of the 'Made' did not fail.
4. 'Passed - As a % of Made' is equal to 'Passed' divided by 'Made', shown as
a %, and can never exceed 100%.
   a. Example: For Date 1, 1121 / 1167 = 0.96, shown as 96%.
5. The '% Increase in Accuracy' row is the problem. For 'Date 2' I need to
derive the percentage increase of 'Passed' parts relative to 'Date 1' and
show it as an increase or decrease trend towards 100% accuracy.
6. Then for 'Date 3' I need to derive the percentage increase of 'Passed'
parts relative to 'Date 2', and so on... (see the sketch after this list).
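
For concreteness, here is a minimal sketch of how my sheet could be laid out.
The cell references (Dates in B1:D1, Made in row 2, Passed in row 3) are just
for illustration, and I can see at least two candidate formulas for row 5,
since "% increase" can be read more than one way:

    B4: =B3/B2           Passed as a % of Made (format cell as %)
    C4: =C3/C2           fill right for each new Date
    C5: =C4-B4           change in percentage points vs. the prior Date
    C5: =(C4-B4)/B4      alternative: relative change vs. the prior Date

The first version of C5 gives the change in accuracy in percentage points
(negative when accuracy drops); the second gives it as a fraction of the
prior Date's accuracy. Either would be filled right (D5, E5, ...) as new
Dates are added.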

So I need to solve for X% and Y% as a trend towards the goal of 100% passed.
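
For example, with the sample data above, the percentage-point reading gives
X = 91% - 96% = -5 points (a decrease) and Y = 99% - 91% = +8 points (an
increase), while the relative reading gives roughly X = -5% and Y = +9%. I'm
not sure which of these, if either, correctly expresses the trend towards
100%.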

Thanks in Advance.