An archer releases an arrow with an initial velocity of 30 feet per second at a height of 3 feet. The path the arrow takes
can be modeled by the function f(x) = -16x² + 30x + 3, where f(x) represents the height, in feet, of the arrow
and x represents the time the arrow travels, in seconds. How many seconds until the arrow hits the ground? Round
your answer to the nearest hundredth if necessary.
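One way to work this out (a sketch, assuming the standard quadratic formula): the arrow hits the ground when f(x) = 0, so solve -16x² + 30x + 3 = 0 with a = -16, b = 30, c = 3 and keep the positive root.

\[
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
  = \frac{-30 \pm \sqrt{30^2 - 4(-16)(3)}}{2(-16)}
  = \frac{-30 \pm \sqrt{900 + 192}}{-32}
  = \frac{-30 \pm \sqrt{1092}}{-32}
\]

Since √1092 ≈ 33.05, the positive root is x ≈ (-30 - 33.05)/(-32) ≈ 1.97, so the arrow hits the ground after about 1.97 seconds (the other root is negative and is discarded).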