Question: Here's a C-programming question. Let's say your code has defined a 16-bit signed integer and reads a 12-bit value from the ADC. What is the maximum value that your code can multiply the ADC value by before storing the result in the 16-bit signed integer variable, without losing any bits? Now, suppose that for whatever reason you need to multiply the ADC value by 74, but that you also need to divide it by a normalization factor of 165. Write the code, including the variable and constant (if any) definitions, to perform this calculation with minimal loss of accuracy in the final signed-integer result.
Step by Step Solution
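Step 1: Find the maximum safe multiplier.
A 12-bit ADC reading is at most 4095, and a 16-bit signed integer holds at most 32767. The largest multiplier that can never overflow is floor(32767 / 4095) = 8: 4095 * 8 = 32760 fits, while 4095 * 9 = 36855 does not. Equivalently, multiplying by 8 is a 3-bit left shift, and 12 + 3 = 15 bits is exactly the magnitude an int16_t can represent. So the answer is 8.

Step 2: Order the operations to minimize accuracy loss.
To scale by 74/165, multiply first and divide last, so truncation happens only once and on the largest possible value. The worst-case product 4095 * 74 = 303030 needs 19 bits, so it must be held in a 32-bit intermediate. Adding half the divisor before dividing rounds the result to nearest instead of truncating toward zero.

Step 3: Write the code.
Below is a minimal sketch of one way to do this. The function name scale_adc, the macro names, and the assumption that the raw reading arrives as a uint16_t in the range 0..4095 are illustrative choices, not given in the problem.

#include <stdint.h>

#define ADC_SCALE_NUM  74    /* multiplier required by the application */
#define ADC_SCALE_DEN  165   /* normalization divisor                  */

int16_t scale_adc(uint16_t adc_raw)   /* adc_raw: 12-bit reading, 0..4095 (assumed) */
{
    /* Multiply first in a 32-bit intermediate: the worst-case product
       4095 * 74 = 303030 needs 19 bits and would overflow an int16_t. */
    int32_t product = (int32_t)adc_raw * ADC_SCALE_NUM;

    /* Add half the divisor before dividing to round to nearest
       instead of truncating toward zero.                              */
    return (int16_t)((product + ADC_SCALE_DEN / 2) / ADC_SCALE_DEN);
}

As a check, with adc_raw = 4095 the product is 303030 and (303030 + 82) / 165 = 1837, which matches 4095 * 74 / 165 = 1836.55 rounded to nearest; the largest possible result, 1837, fits comfortably in the 16-bit signed destination.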
