Post by rod on Sept 10, 2020 5:39:20 GMT -6
Yes I know ....... I think this subject was chewed on a bit in TCF.
HOWEVER ....... Is it the case (maybe it's John Deere) that there is a claim that loss sensors are speed indexed? I thought someone with actual knowledge on this matter said they were ...... but I'm not sure if that's ever been proven or disproven by an independent testing organisation. Logic tells me this shouldn't be the case, but I'm curious, as the discussion arose today in regard to loss measurements & calibrating loss monitors. Anyone know for certain one way or the other?
Post by Albertabuck on Sept 10, 2020 8:55:00 GMT -6
Are you talking about loss monitors sensing loss out the back and the results being adjusted for changes in the speed of the machine? If so, they have had that for years, and it's basically to compensate for the increased amount of loss as speed increases, so that the monitor continues to read the same. It's not the sensors themselves, but the brains of the system that does it. Both of my pull types from the 80s have it, Case IH and Versatile, and both have a manual adjustment where you can set the system for a certain speed and correlate your chosen amount of loss to a reading on the monitor, and then it automatically goes from there on its own as ground speed changes up or down. I would think the newer ones would have to use the same tech, though it wouldn't surprise me if they have taken away the ability to adjust the reading when setting it all up.
If I misunderstood your question, my apologies.
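As I read it, the "brains" part boils down to something like the sketch below. This is purely my own illustration of the idea, not any actual monitor's firmware, and all the names and numbers in it are made up:

```python
# Illustrative sketch only, NOT Case IH / Versatile / Deere firmware.
# The pads just count grain impacts per second. At a higher ground speed the
# same loss-per-acre hits the pads more often, so the "brains" divide the
# impact rate by the area covered per second to keep the readout meaning
# "loss per acre" rather than "loss per second".

def display_reading(impacts_per_sec, speed_mph, header_width_ft, cal_factor=1.0):
    """Return a meter reading proportional to loss per unit area."""
    if speed_mph <= 0:
        return 0.0                        # stopped: nothing meaningful to show
    feet_per_sec = speed_mph * 5280.0 / 3600.0
    acres_per_sec = feet_per_sec * header_width_ft / 43560.0
    impacts_per_acre = impacts_per_sec / acres_per_sec
    return cal_factor * impacts_per_acre  # cal_factor gets set when you calibrate

# The same loss per acre at 3 mph and at 5 mph gives the same reading,
# even though the pads are being hit roughly 67% more often at 5 mph.
```

Point being, the box knows ground speed, so it can scale the raw sensor rate before it ever drives the meter or the lights.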
Post by rod on Sept 10, 2020 15:16:39 GMT -6
Yes ..... basically that's the premise of the question. I find the logic of a (grain) loss sensor compensating for speed somewhat confronting. If I'm understanding the claim correctly, in broad terms: the faster you go, i.e. more throughput, there would logically be more grain loss (unless some settings were changed to compensate for the extra throughput), and therefore the system (the brains) would recognise both the loss increase and the speed increase, and then adjust the "display readout" to reflect this. Is this how it works, or have I got the bull by the horns?
Post by Albertabuck on Sept 10, 2020 19:09:12 GMT -6
Yep, we are both on the same page here, and yes, you are correct. I gotta run right now, but the idea of why they did it was so you could literally have someone run the machine, speed-wise, based on maintaining the position of the meter or lights etc., however it works. In other words, the monitor was supposed to do the thinking instead of the operator. Not sure, but I think it's the book for the system on the 1682 that explains it best, might be the Versy, but I have a special manual for the Grain Scan system on the 1682 which I always assumed is the same as what they run on the 1680 etc. as well. And it explains how to set it for acceptable loss at a certain speed and then run the machine based off the monitor.

I never relied on it at all. I do keep an eye on it, but I always check under all different loads and speeds by what's on the ground. Honestly, with both machines, I actually use the monitor as an indicator that everything is working properly. Especially on the old Versy 2K, if tailings or anything else started acting up, the monitor would go off the deep end, and if the red lights came on and stayed on, you better be stepping on the clutch and yanking back on the PTO 'cause you probably got belts smoking lol

I'll try to look into how they technically explain it. Gimme a day or two. Thanks.
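In the meantime, the gist of that "set acceptable loss at one speed, then run off the monitor" procedure, the way I understand it, is roughly the sketch below. Again, this is just my own rough illustration, not the book's actual steps, and the names and thresholds are mine:

```python
# Rough sketch of the "calibrate once, then drive to the meter" idea.
# Names and numbers are mine, not from the Grain Scan manual.

TARGET_MARK = 0.5          # where on the meter you want the needle to sit

def calibrate(reading_at_acceptable_loss):
    """At your chosen speed, ground-check the loss until it is what you can
    live with, then scale the system so that amount of loss puts the needle
    right on the target mark."""
    return TARGET_MARK / reading_at_acceptable_loss   # becomes the cal factor

def speed_hint(current_reading):
    """From then on the operator (or the lights) just works off the needle."""
    if current_reading > TARGET_MARK * 1.1:
        return "back off, losing more than you decided was acceptable"
    if current_reading < TARGET_MARK * 0.9:
        return "room to push a little harder"
    return "hold this speed"
```

Which is exactly why I still check what's actually on the ground; the meter is only as good as that first calibration.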