Ed or Joe,
I've been messing around with HP-IB enabled components - an HP 8903A analyzer and an HP 3455A meter. I have an IEEE-488 USB adapter that acts like an old-fashioned COM port, so it's pretty easy to use to communicate with both.
What I would like is a routine that injects a preamp-level signal into an amp and, by measuring the output, "calibrates" it to a specific wattage as required. I do have that working somewhat now, but it's not quite refined. The idea is that you'd first make a good estimate of what input voltage to apply, then really dial it in using an optimum increment (hmm... maybe even a stored "cal factor", but I'm getting off point here).
I'd also like the same type of "cal" routine, but one that stops increasing the input at the point where an exact (or pretty close) distortion-limit measurement is met.
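A minimal sketch of that distortion-limit ramp, with the instrument I/O hidden behind hypothetical callables (`set_input`, `read_volts`, `read_thd_pct` are placeholder names, not actual HP-IB commands - in practice they'd wrap writes/reads on the adapter's COM port):

```python
def find_power_at_thd(set_input, read_volts, read_thd_pct, thd_limit_pct,
                      load_ohms=8.0, start_volts=0.02, step_factor=1.12,
                      max_volts=1.0):
    """Ramp the input voltage up until measured THD reaches the limit.

    Returns (watts, v_in) at the first reading at or above the limit,
    or the last safe reading if max_volts is hit first.
    """
    v_in = start_volts
    last = (0.0, start_volts)
    while v_in <= max_volts:
        set_input(v_in)                      # program the generator level
        thd = read_thd_pct()                 # analyzer distortion reading
        v_out = read_volts()                 # output voltage at the load
        watts = v_out ** 2 / load_ohms       # P = V^2 / R
        if thd >= thd_limit_pct:
            return watts, v_in
        last = (watts, v_in)
        v_in *= step_factor                  # geometric steps: fine near the knee
    return last
```

A geometric step (here ~12% per step) keeps the sweep quick at low levels while still landing reasonably close to the clipping knee; once the gain is known, the starting point could be predicted instead of always beginning at 0.02 V.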
So I'm thinking: how to actively measure and calculate the gain of an amp, then from that figure calculate the optimal voltage and step-increment settings to use when performing the tests above - the specific-wattage test, and a distortion-limit test that checks what wattage the amp is putting out when the desired limit is met. And perhaps, once the initial measurement is taken and the gain is known, calculate an estimated input voltage to apply - i.e., use the difference between that measurement and the desired one to narrow down the range of input needed.
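The gain-then-predict step is just Ohm's-law arithmetic: for a target power P into load R, the needed output is V = sqrt(P*R), and with a measured gain G the input is V/G. A sketch, again with hypothetical `set_input`/`read_output` callables standing in for the real instrument I/O:

```python
import math

def measure_gain(set_input, read_output, probe_volts=0.05):
    """Estimate voltage gain from one low-level probe measurement
    (low enough that a typical amp won't be anywhere near clipping)."""
    set_input(probe_volts)
    return read_output() / probe_volts

def predict_input_for_watts(gain, target_watts, load_ohms=8.0):
    """Predict the input voltage that should yield target_watts into
    load_ohms, assuming the amp stays linear: V_out = sqrt(P * R)."""
    v_out_target = math.sqrt(target_watts * load_ohms)
    return v_out_target / gain
```

For example, a gain of 20 and a 10 W target into 8 ohms needs sqrt(80) ≈ 8.94 V out, so about 0.447 V in - a starting point to refine, not a final answer, since real amps compress near full power.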
So first it would perhaps start out at a specific (but low... don't want to blow anything up) input voltage and measure the output. I'm assuming multiple measurements need to be made while increasing the input voltage, right? Then work from those - or am I off base?
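Putting the pieces together, the "start low, measure, predict, refine" loop might look like this - a sketch under the same assumption that hardware access is wrapped in callables, with hypothetical names and an 8-ohm dummy load assumed:

```python
import math

def calibrate_to_watts(set_input, read_output_volts, target_watts,
                       load_ohms=8.0, start_volts=0.02, max_volts=1.0,
                       tolerance=0.02, max_steps=30):
    """Iteratively find the input voltage giving target_watts at the output.

    Starts at a deliberately low level, measures, then uses the gain seen
    at each step to predict the next input. Re-measuring each time means
    gain compression near full power is corrected automatically.
    Returns (v_in, watts) once within tolerance (fractional) of the target.
    """
    v_in = start_volts
    watts = 0.0
    for _ in range(max_steps):
        set_input(v_in)
        v_out = read_output_volts()
        watts = v_out ** 2 / load_ohms
        if abs(watts - target_watts) <= tolerance * target_watts:
            return v_in, watts
        gain = v_out / v_in                          # gain at this level
        v_target = math.sqrt(target_watts * load_ohms)
        v_in = min(v_target / gain, max_volts)       # clamp for safety
    return v_in, watts
```

For a perfectly linear amp this converges in two measurements (one probe, one confirm); a real amp near clipping takes a few more, since the measured gain drops and the prediction is re-corrected each pass - which I think answers the multiple-measurements question: yes, but only a handful, not a blind sweep.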
Thanks!