when filtering out sub-optimal data, such as records with a lower signal strength?
Any reading with a distance of 1 cm indicates the sensor was unable to complete a ranging before gathering the maximum threshold quantity of micro-samples. Such readings should be discarded.
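The discard rule above can be sketched as a simple filter. This is a minimal illustration, not the sensor's actual API: the field names (`distance_cm`, `signal`) and the list-of-dicts layout are assumptions.

```python
# Hypothetical sketch: drop sentinel readings whose distance is 1 cm,
# which signals that the ranging did not complete. Field names are
# assumptions, not the device's actual data-block layout.

def filter_sentinels(readings, sentinel_cm=1.0):
    """Discard readings carrying the 1 cm 'ranging incomplete' sentinel."""
    return [r for r in readings if r["distance_cm"] != sentinel_cm]

scan = [
    {"angle_deg": 0.0, "distance_cm": 152.3, "signal": 210},
    {"angle_deg": 1.0, "distance_cm": 1.0,   "signal": 12},   # incomplete ranging
    {"angle_deg": 2.0, "distance_cm": 149.8, "signal": 205},
]
clean = filter_sentinels(scan)
print(len(clean))  # -> 2
```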
Signal strength varies wildly depending on the environment, geometry, materials, and so on. Therefore it is unwise to base any decisions on a raw signal-strength value; instead, consider the signal strength relative to the surrounding readings in a scan.
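One way to act on that advice is to flag readings whose signal strength is low relative to their neighbours, rather than below a fixed absolute threshold. The sketch below is an assumption about how you might do this, not a documented procedure; the window size and ratio are arbitrary tuning parameters.

```python
from statistics import median

def relative_outliers(signals, window=2, ratio=0.5):
    """Return indices of readings whose signal strength falls below
    `ratio` times the median of their neighbouring readings.
    This compares each reading to its local context instead of
    applying a raw absolute cutoff."""
    flagged = []
    for i, s in enumerate(signals):
        lo, hi = max(0, i - window), min(len(signals), i + window + 1)
        neighbours = signals[lo:i] + signals[i + 1:hi]
        if neighbours and s < ratio * median(neighbours):
            flagged.append(i)
    return flagged

# The reading at index 2 is weak relative to its surroundings.
print(relative_outliers([200, 198, 40, 202, 199]))  # -> [2]
```

The same absolute value (40) might be perfectly normal in a different environment, which is why the comparison is made against nearby readings rather than a fixed number.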
Is there a value of signal strength that suggests the other components of the data block may be compromised?
Signal strength will never indicate that a data block is compromised in the sense of an error or corrupted transmission, but it can be an indication of confidence in the measurement. Signal strength does not guarantee anything about the accuracy of the reading; rather, it conveys confidence in the process used to obtain the reading, which usually correlates with accuracy. See the manual and the article on "Theory of Operation" for more details, and feel free to ask more questions if things are still unclear!
I have very solid lines of interconnected dots. In your visualizer, are you sampling, e.g. using every 5th data block for plotting?
Hmmm. The visualizer does not perform any kind of sampling. It draws a point for every valid reading in a scan. However, it draws an entire 360-degree scan at once and clears it before drawing the next scan; it does not draw the readings one by one or leave multiple scans on the screen at the same time. Perhaps your application is leaving points from previous scans on the canvas, so that over time it appears more dense? The visualizer does provide a feature called Sweep Decay History, under the Render Settings sub-panel, that will persist a small history of scans on the screen. You can experiment with that to see if it looks more like what you are seeing.
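For reference, a decay history like the one described can be sketched as a bounded buffer of recent scans, each rendered with a fading weight. This is a toy illustration of the idea only; the class and method names are invented and are not the visualizer's actual API.

```python
from collections import deque

class SweepHistory:
    """Toy sketch of a sweep-decay buffer: keep the last `depth` scans
    and expose each with a fading weight (newest = 1.0). Names here are
    assumptions, not the visualizer's real interface."""

    def __init__(self, depth=4):
        # deque(maxlen=...) silently evicts the oldest scan when full.
        self.scans = deque(maxlen=depth)

    def push(self, scan):
        self.scans.append(scan)

    def render_weights(self):
        """Pair each retained scan with a linear fade factor."""
        n = len(self.scans)
        return [(scan, (i + 1) / n) for i, scan in enumerate(self.scans)]

hist = SweepHistory(depth=3)
for s in ("scan1", "scan2", "scan3", "scan4"):
    hist.push(s)
# Only the 3 most recent scans survive; the oldest retained one fades most.
print(hist.render_weights())
```

A renderer would multiply each scan's point opacity by its weight, so old sweeps fade out instead of accumulating indefinitely (which is what a canvas that is never cleared would do).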