|
anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, enabling anti-aliasing increases the computation this VI requires during resampling.
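The folding that anti-aliasing? guards against is easy to see numerically. The following sketch (plain NumPy, not LabVIEW code) decimates a tone by 2 with no lowpass filter; a component above the new Nyquist limit folds back to a different frequency.

```python
import numpy as np

# A tone at 0.3 cycles/sample, decimated by 2 with no anti-aliasing filter.
# The new Nyquist limit is 0.25 cycles of the original rate, so the tone
# folds: 0.6 cycles/sample at the new rate aliases to 0.4.
n = np.arange(4000)
x = np.cos(2 * np.pi * 0.3 * n)

y = x[::2]                           # naive decimation, no lowpass filter
spec = np.abs(np.fft.rfft(y))
peak = np.argmax(spec) / len(y)      # dominant frequency, cycles/sample
# peak is 0.4, not 0.3: the tone has aliased.
```

With anti-aliasing enabled, the lowpass filter removes such components before decimation at the cost of the extra filtering computation.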
|
|
reset controls the initialization of resampling. The default is FALSE. The first time this VI runs or when reset is TRUE, LabVIEW initializes the internal states of the VI to zero and uses start index to determine when the resampling starts. The next time this VI runs and reset is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this VI.
To process a large data sequence that consists of smaller blocks, set reset to TRUE for the first block and to FALSE for all the remaining blocks in continuous resampling.
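The reset pattern above can be sketched in Python. The class below is an illustrative streaming rational resampler (zero-stuff by the interpolation factor, FIR lowpass with carried state, then decimate with a phase carried across blocks); the filter design and class are hypothetical, not LabVIEW's internal implementation.

```python
import numpy as np
from scipy import signal

class StreamingRationalResampler:
    """Blockwise rational resampling with internal state, mimicking the
    reset=TRUE-for-the-first-block pattern described above.  An
    illustrative sketch, not LabVIEW's implementation."""

    def __init__(self, interpolation, decimation, numtaps=151):
        self.L, self.M = interpolation, decimation
        # Lowpass at the tighter of the two Nyquist limits; gain L makes up
        # for the energy lost to zero-stuffing.  (Hypothetical design.)
        self.h = signal.firwin(numtaps, 1.0 / max(self.L, self.M)) * self.L
        self.reset()

    def reset(self):
        # Equivalent to reset=TRUE: clear the filter state and the
        # decimation phase carried between blocks.
        self.zi = np.zeros(len(self.h) - 1)
        self.phase = 0

    def process(self, x):
        up = np.zeros(len(x) * self.L)
        up[::self.L] = x                          # upsample by zero-stuffing
        y, self.zi = signal.lfilter(self.h, 1.0, up, zi=self.zi)
        out = y[self.phase::self.M]               # keep every M-th sample
        self.phase = (self.phase - len(up)) % self.M
        return out

# Processing in blocks (reset once, then keep the state) matches
# processing the whole sequence in one call.
x = np.random.default_rng(0).standard_normal(1000)
whole = StreamingRationalResampler(3, 2).process(x)
r = StreamingRationalResampler(3, 2)
blocks = np.concatenate([r.process(x[:400]), r.process(x[400:])])
```

Because the filter state and decimation phase survive between calls, the block boundaries leave no seams in the output, which is exactly why the VI keeps its internal states when reset is FALSE.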
|
|
X is the input real signal for resampling. The sampling interval of X is 1.
|
|
start index determines where resampling starts on the first call to this VI or when reset is TRUE. Specify start index in terms of the interpolated signal, that is, the signal after X is upsampled by the interpolation factor. start index must be greater than or equal to 0. The default is 0.
|
|
resample factor contains the interpolation factor and the decimation factor for resampling.
|
interpolation is the interpolation factor for resampling. The default is 1.
|
|
decimation is the decimation factor for resampling. The default is 1.
|
|
|
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
|
|
FIR filter specifications contains the minimum values this VI needs to design the FIR filter.
|
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. The default is 120.
|
|
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
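One plausible reading of how these two specifications constrain the filter can be sketched with a Kaiser-window design. Everything below is an assumption for illustration (LabVIEW does not document its internal design method): the passband edge is normalized bandwidth times the new sampling rate, the stopband starts at the new Nyquist frequency, and alias rejection (dB) sets the stopband attenuation.

```python
import numpy as np
from scipy import signal

# Hypothetical mapping of the two specs to a Kaiser-window FIR design,
# for interpolation L = 3, decimation M = 2, input rate normalized to 1.
L, M = 3, 2
alias_rejection_db = 120.0
normalized_bw = 0.4536

fs_up = float(L)                  # rate after zero-stuffing
fs_new = L / M                    # new rate (output interval = M/L)
passband = normalized_bw * fs_new # edge of the band left unattenuated
stopband = 0.5 * fs_new           # new Nyquist: aliases fold past here

# Kaiser order and shape parameter that meet the requested stopband
# attenuation over the available transition width.
numtaps, beta = signal.kaiserord(alias_rejection_db,
                                 (stopband - passband) / (fs_up / 2))
h = signal.firwin(numtaps, (passband + stopband) / 2,
                  window=('kaiser', beta), fs=fs_up) * L

w, H = signal.freqz(h, worN=4096, fs=fs_up)
```

Tightening either specification (higher alias rejection, or normalized bandwidth closer to 0.5) narrows the transition band and lengthens the filter, which is the computational cost the anti-aliasing? description refers to.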
|
|
|
Y returns the resampled signal. The sampling interval of Y is decimation/interpolation.
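The length and interval relationship can be checked with a one-shot SciPy analogue of this instance (resample_poly, offered here only as a comparable rational resampler, not as the VI's implementation).

```python
import numpy as np
from scipy import signal

# One-shot rational resampling: upsample by 3, downsample by 2.
x = np.sin(2 * np.pi * 0.05 * np.arange(200))  # input, sampling interval 1
y = signal.resample_poly(x, up=3, down=2)

# The output has ceil(200 * 3/2) = 300 samples, so its sampling interval
# is decimation/interpolation = 2/3 of the input interval, and the tone
# moves from 0.05 to 0.05 * 2/3 cycles/sample.
spec = np.abs(np.fft.rfft(y))
```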
|
|
t0 returns the time instant of the first sample of Y.
|
|
error out contains error information. This output provides standard error out functionality.
|
|
anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, enabling anti-aliasing increases the computation this VI requires during resampling.
|
|
reset controls the initialization of resampling. The default is FALSE. The first time this VI runs or when reset is TRUE, LabVIEW initializes the internal states of the VI to zero and uses start index to determine when the resampling starts. The next time this VI runs and reset is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this VI.
To process a large data sequence that consists of smaller blocks, set reset to TRUE for the first block and to FALSE for all the remaining blocks in continuous resampling.
|
|
X is the input complex signal for resampling. The sampling interval of X is 1.
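For this instance X is complex, and both the real and imaginary parts are resampled coherently. A SciPy analogue handles complex input the same way, shown here with a complex exponential (names and factors are illustrative only).

```python
import numpy as np
from scipy import signal

# Complex input: an analytic tone at +0.1 cycles/sample, interval 1.
n = np.arange(400)
x = np.exp(2j * np.pi * 0.1 * n)

# Upsample by 2: the output interval becomes 1/2 and the tone appears
# at +0.05 cycles/sample of the new rate, still a single positive-
# frequency component (both parts resampled coherently).
y = signal.resample_poly(x, up=2, down=1)
spec = np.abs(np.fft.fft(y))
```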
|
|
start index determines where resampling starts on the first call to this VI or when reset is TRUE. Specify start index in terms of the interpolated signal, that is, the signal after X is upsampled by the interpolation factor. start index must be greater than or equal to 0. The default is 0.
|
|
resample factor contains the interpolation factor and the decimation factor for resampling.
|
interpolation is the interpolation factor for resampling. The default is 1.
|
|
decimation is the decimation factor for resampling. The default is 1.
|
|
|
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
|
|
FIR filter specifications contains the minimum values this VI needs to design the FIR filter.
|
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. The default is 120.
|
|
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
|
|
|
Y returns the resampled signal. The sampling interval of Y is decimation/interpolation.
|
|
t0 returns the time instant of the first sample of Y.
|
|
error out contains error information. This output provides standard error out functionality.
|
|
anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, enabling anti-aliasing increases the computation this VI requires during resampling.
|
|
reset controls the initialization of resampling. The default is FALSE. The first time this VI runs or when reset is TRUE, LabVIEW initializes the internal states of the VI to zero and uses start index to determine when the resampling starts. The next time this VI runs and reset is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this VI.
To process a large data sequence that consists of smaller blocks, set reset to TRUE for the first block and to FALSE for all the remaining blocks in continuous resampling.
|
|
X contains the input real signals for resampling. Each row of X contains an input signal. All the signals have the same length. The sampling interval of each signal in X is 1.
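The row-per-signal layout maps directly onto an axis argument in a SciPy analogue: every row is resampled independently with the same factors, as this instance does (the shapes and factors below are illustrative).

```python
import numpy as np
from scipy import signal

# Four signals of equal length, one per row, each with interval 1.
rng = np.random.default_rng(1)
X = rng.standard_normal((4, 120))

# Resample all rows at once along axis=1: up 5, down 6 gives
# ceil(120 * 5/6) = 100 samples per row, interval 6/5.
Y = signal.resample_poly(X, up=5, down=6, axis=1)
```

Resampling a row on its own gives the same result as its row of Y, which is the per-signal independence the description above implies.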
|
|
start index determines where resampling starts on the first call to this VI or when reset is TRUE. Specify start index in terms of the interpolated signal, that is, the signal after X is upsampled by the interpolation factor. start index must be greater than or equal to 0. The default is 0.
|
|
resample factor contains the interpolation factor and the decimation factor for resampling.
|
interpolation is the interpolation factor for resampling. The default is 1.
|
|
decimation is the decimation factor for resampling. The default is 1.
|
|
|
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
|
|
FIR filter specifications contains the minimum values this VI needs to design the FIR filter.
|
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. The default is 120.
|
|
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
|
|
|
Y returns the resampled signals. Each row of Y contains a resampled signal that corresponds to the input signal in the same row of X. All the resampled signals have the same length. The sampling interval of each signal is decimation/interpolation.
|
|
t0 returns the time instant of the first sample of each signal in Y.
|
|
error out contains error information. This output provides standard error out functionality.
|
|
anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, enabling anti-aliasing increases the computation this VI requires during resampling.
|
|
reset controls the initialization of resampling. The default is FALSE. The first time this VI runs or when reset is TRUE, LabVIEW initializes the internal states of the VI to zero and uses start index to determine when the resampling starts. The next time this VI runs and reset is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this VI.
To process a large data sequence that consists of smaller blocks, set reset to TRUE for the first block and to FALSE for all the remaining blocks in continuous resampling.
|
|
X contains the input complex signals for resampling. Each row of X contains an input signal. All the signals have the same length. The sampling interval of each signal is 1.
|
|
start index determines where resampling starts on the first call to this VI or when reset is TRUE. Specify start index in terms of the interpolated signal, that is, the signal after X is upsampled by the interpolation factor. start index must be greater than or equal to 0. The default is 0.
|
|
resample factor contains the interpolation factor and the decimation factor for resampling.
|
interpolation is the interpolation factor for resampling. The default is 1.
|
|
decimation is the decimation factor for resampling. The default is 1.
|
|
|
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
|
|
FIR filter specifications contains the minimum values this VI needs to design the FIR filter.
|
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. The default is 120.
|
|
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
|
|
|
Y returns the resampled signals. Each row of Y contains a resampled signal that corresponds to the input signal in the same row of X. All the resampled signals have the same length. The sampling interval of each signal is decimation/interpolation.
|
|
t0 returns the time instant of the first sample of each signal in Y.
|
|
error out contains error information. This output provides standard error out functionality.
|
The following steps describe the rational resampling process. Each step corresponds to a numbered section of the following image.