Old 02-18-21, 03:05 PM   #1952
LGN1

Hi,


I think it's exactly the opposite with the sensitivity value of the visual sensor: larger values mean better detection. Here's why I think so:


- A larger detection time means a smaller probability of being spotted at a single instant in time.
- The probability of being spotted decreases with growing distance.
- The text says "at (sensitivity * max range) we have a double detection time". Suppose Max_range = 10000 and sensitivity = 0.1: already at 1000 the detection time is doubled, and thus the probability of being spotted at a single instant in time is halved. Now suppose sensitivity = 1: the detection time only doubles at 10000, i.e., the probability of being detected drops much more slowly with distance than in the sensitivity = 0.1 case (see the small sketch below).
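Here is a minimal sketch of that reasoning, purely to illustrate the scaling and not the game's actual code: it assumes the detection time grows linearly with range and doubles at (sensitivity * max_range), as the config text describes. The function names, the base time, and the linear model are my own assumptions.

[CODE]
# Hypothetical sketch only: the real in-game formula is not known here.
# Assumption: detection time grows linearly with range and doubles at
# (sensitivity * max_range). Names and base_time are illustrative.

def detection_time(range_m, sensitivity, max_range_m=10000.0, base_time=1.0):
    """Assumed model: time to detect doubles at sensitivity * max_range."""
    return base_time * (1.0 + range_m / (sensitivity * max_range_m))

def spot_probability_per_instant(range_m, sensitivity):
    """Chance of being spotted in a single instant ~ 1 / detection time."""
    return 1.0 / detection_time(range_m, sensitivity)

for sens in (0.1, 1.0):
    p = spot_probability_per_instant(1000.0, sens)
    print(f"sensitivity={sens}: relative spotting chance at 1000 m = {p:.2f}")

# Output under these assumptions:
#   sensitivity=0.1: relative spotting chance at 1000 m = 0.50
#   sensitivity=1.0: relative spotting chance at 1000 m = 0.91
# The larger sensitivity keeps the per-instant probability higher at the
# same range, i.e. better detection.
[/CODE]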


To settle this, I propose creating a test setup and running two experiments, one with sens. = 0.1 and one with sens. = 0.9 or even larger. The results should differ noticeably.


Best, LGN1