Jim comes up with an algorithm that takes bit operations to handle an input text with n words.
Suppose the computers in your business can handle one bit operation every nanosecond (1 nanosecond = 10^(-9) seconds).
How many nanoseconds would it take Jim's algorithm to convert a text with words on these computers?
How many DAYS would it take Jim's algorithm to convert a text with words on these computers? (Do not round your answers for WeBWorK.)
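The conversion asked for above is straightforward unit arithmetic: nanoseconds to seconds, then seconds to days and years. The sketch below illustrates it with a hypothetical operation count of n^2 and a hypothetical word count n = 10^8; substitute the actual values from your problem statement.

```python
# Illustrative sketch only: the operation count (n**2 here) and the word
# count (n = 10**8 here) are assumptions, not the values from the problem.

def runtime(bit_ops: float) -> tuple:
    """Convert a bit-operation count to (nanoseconds, days, years)."""
    ns = bit_ops                  # 1 bit operation per nanosecond
    seconds = ns * 1e-9           # 1 nanosecond = 10^-9 seconds
    days = seconds / (60 * 60 * 24)
    years = days / 365
    return ns, days, years

n = 10**8                         # hypothetical word count
ns, days, years = runtime(n**2)   # hypothetical n^2-bit-operation algorithm
print(f"{ns:.4g} ns = {days:.4g} days = {years:.4g} years")
```

Do not round intermediate results; WeBWorK compares against the exact quotient.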
(Recall a million is 10^6, a billion is 10^9, and a trillion is 10^12.)
For an input text of words, the statement that best describes the performance of Jim's algorithm is:
A. His algorithm would take between and years to run.
B. His algorithm would take between thousand and million years to run.
C. His algorithm would take between billion and trillion years to run.
D. His algorithm would take between and years to run.
E. His algorithm would take between and years to run.
F. His algorithm would take between million and billion years to run.
G. His algorithm would take more than trillion years to run.
You can earn partial credit on this problem.