How to show % increase when accuracy decreases?
Note: In example 4a, you should not be multiplying by .01, as that would give
you less than 1% as an answer. Multiply by 100 instead (or just format the
cell as a percentage).
I think you're overthinking the problem. Using your rounded rates, the changes
are -5% and +8% respectively. Just subtract each percentage from the prior
one.
Note that even though you passed more parts on Date 2, the ratio of
passed/made decreased, so your percentage change is negative (the process is
getting worse, not better).
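A quick sketch of the idea (in Python rather than an Excel formula, just to show the arithmetic): compute each date's pass rate, then take the simple difference in percentage points from the prior date. Note that working from the unrounded ratios gives slightly different point changes than subtracting the rounded 96/91/99 figures.

```python
# Sample data from the post
made = [1167, 1318, 1225]    # parts made per date
passed = [1121, 1203, 1209]  # parts that did not fail

# Pass rate per date, as a percentage (e.g. 1121 / 1167 * 100)
rates = [p / m * 100 for p, m in zip(passed, made)]

# Change vs. the prior date, in percentage points; the first date
# has no prior date, so its change is 0
changes = [0.0] + [rates[i] - rates[i - 1] for i in range(1, len(rates))]

for rate, change in zip(rates, changes):
    print(f"{rate:.0f}%  change: {change:+.0f} pts")
```

In Excel terms this is just `=C3/C2` formatted as a percentage for the rate row, and `=D4-C4` for the change row (cell references are illustrative, not from the original post).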
--
Best Regards,
Luke M
*Remember to click "yes" if this post helped you!*
"tibado" wrote:
Ok, I'm stuck on this one and hope you can help
Here is the sample data
Date                       1      2      3
Made                    1167   1318   1225
Passed                  1121   1203   1209
Passed - As a % of Made  96%    91%    99%
% Increase in Accuracy    0%     X%     Y%
1. The 'Date' row will continue for an indeterminate amount of time.
2. The 'Made' values are the total produced for each 'Date' and may increase
or decrease from the prior 'Date'.
3. The 'Passed' values are derived from how many of the 'Made' did not fail.
4. The 'Passed - As a % of Made' row is equal to 'Passed' divided by 'Made',
shown as a %, and can never exceed 100%.
a. Example: For Date 1: 1121 / 1167 * .01 = 96%
5. The '% Increase in Accuracy' row is the problem. I need to derive for
'Date 2' the percentage increase of 'Passed' parts as it relates to 'Date 1',
and show it as an increasing or decreasing trend towards 100% accuracy.
6. Then I need to derive for 'Date 3' the percentage increase of 'Passed'
parts as it relates to 'Date 2', and so on...
So I need to solve for X% and Y% as a trend towards the goal of 100% passed.
Thanks in Advance.