I need to run a SELECT against a table with 400,000,000 records where the column is not indexed.
The column is a float, and I'm simply doing a SELECT WHERE X > Y.
I'm trying to calculate how long an operation like this should take.
My initial thought was that SQL, reading 8K blocks, would require about 5.5 million reads. So assuming 1 ms per read, that's roughly 1.5 hours.
Now I know the logic and numbers are off, but I was wondering whether this is a viable method for estimating something like this?
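For reference, the back-of-envelope math above can be sketched out like this. The row size is an assumption chosen so the page count comes out near the 5.5 million figure in the question; the 1 ms per read is the question's figure, not a measured value:

```python
# Rough full-table-scan estimate using the numbers from the question.
total_rows = 400_000_000
page_bytes = 8 * 1024      # 8K pages
row_bytes = 110            # assumed average row size (not from the question)
read_ms = 1.0              # assumed cost per page read

rows_per_page = page_bytes // row_bytes            # ~74 rows per 8K page
pages = -(-total_rows // rows_per_page)            # ceiling division
hours = pages * read_ms / 1000 / 3600
print(f"{pages:,} pages, about {hours:.1f} hours")
```

In practice a scan like this is usually dominated by sequential I/O throughput (MB/s) rather than a fixed per-read latency, so measuring the table's size on disk and dividing by the storage's sequential read rate tends to give a tighter estimate.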