Resample (constant to constant) VI

Owning Palette: Signal Operation VIs

Requires: Full Development System

Resamples input signal X according to delay and dt using an FIR filter implementation. Wire data to the X input to determine the polymorphic instance to use or manually select the instance.
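LabVIEW's internal FIR implementation is not exposed, but the constant-to-constant idea can be sketched in Python with SciPy's polyphase FIR resampler. Here `resample_const_to_const`, and the way `delay` and `dt` are handled, are hypothetical analogues of the VI inputs, not the VI's actual algorithm (the input sampling interval is assumed to be 1, as in the VI):

```python
from fractions import Fraction

import numpy as np
from scipy.signal import resample_poly

def resample_const_to_const(x, delay=0.0, dt=0.5):
    """Resample x (sampling interval 1) to sampling interval dt.

    A rough analogue of the VI: dt is approximated by a rational
    down/up ratio and the signal is passed through a polyphase FIR
    resampler, whose lowpass filter provides anti-aliasing.
    The real VI also uses delay to set the interpolation phase;
    this sketch only reports it as the output start time.
    """
    ratio = Fraction(dt).limit_denominator(1000)  # dt = down/up
    up, down = ratio.denominator, ratio.numerator
    y = resample_poly(x, up, down)
    t0 = delay  # time of the first sample in y
    return y, t0

x = np.sin(2 * np.pi * 0.05 * np.arange(100))
y, t0 = resample_const_to_const(x, delay=0.0, dt=0.5)  # 2x oversampling
```

With dt = 0.5 the output rate doubles, so 100 input samples yield 200 output samples.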


Use the pull-down menu to select an instance of this VI.

 Add to the block diagram  Find on the palette

Resample (constant to constant, single-channel)

anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, the computation requirements increase during resampling.
reset? controls the initialization of the internal states. The default is FALSE. The first time this VI runs or when reset? is TRUE, LabVIEW initializes the internal states to zero. When reset? is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this instance of this VI. To process a large data sequence consisting of smaller blocks, set this control to TRUE for the first block and to FALSE for continuous filtering of all remaining blocks.
X contains the input signal for resampling. The sampling interval of X is 1.
delay specifies the time of the first sample in Y.
dt specifies the sampling interval for Y.
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
FIR filter specifications specifies the minimum values this VI needs to specify the FIR filter.
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. If alias rejection (dB) is less than 48, this VI will use 48 instead. The default is 120.
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
Y returns the resampled signal.
t0 returns the time instant of the first sample in Y.
error out contains error information. This output provides standard error out functionality.
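The two FIR filter specifications map naturally onto classic Kaiser-window FIR design: alias rejection (dB) plays the role of the stopband attenuation, and normalized bandwidth sets the passband edge as a fraction of the new sampling rate, with the transition band ending at the new Nyquist frequency. The VI's actual filter design is not documented; the following is a hedged sketch of that interpretation using SciPy:

```python
import numpy as np
from scipy.signal import kaiserord, firwin

def design_antialias_fir(alias_rejection_db=120.0,
                         normalized_bandwidth=0.4536,
                         new_rate=1.0):
    """Kaiser-window lowpass FIR from the VI's two specifications.

    Hypothetical reading of the spec: the band below
    normalized_bandwidth * new_rate passes unattenuated, and aliased
    components above the new Nyquist rate are rejected by at least
    alias_rejection_db.
    """
    atten = max(alias_rejection_db, 48.0)             # VI clamps below 48
    passband_edge = normalized_bandwidth * new_rate   # unattenuated band
    stopband_edge = 0.5 * new_rate                    # new Nyquist frequency
    # kaiserord expects the transition width normalized to Nyquist = 1.
    width = (stopband_edge - passband_edge) / (0.5 * new_rate)
    numtaps, beta = kaiserord(atten, width)
    # Place the cutoff at the middle of the transition band.
    taps = firwin(numtaps, (passband_edge + stopband_edge) / 2,
                  window=("kaiser", beta), fs=new_rate)
    return taps

taps = design_antialias_fir()
```

With the defaults (120 dB, 0.4536), the narrow transition band from 0.4536 to 0.5 of the new rate forces a long filter, which is why tightening these specifications increases the computation cost noted under anti-aliasing?.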

Resample (constant to constant, multi-channel)

anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, the computation requirements increase during resampling.
reset? controls the initialization of the internal states. The default is FALSE. The first time this VI runs or when reset? is TRUE, LabVIEW initializes the internal states to zero. When reset? is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this instance of this VI. To process a large data sequence consisting of smaller blocks, set this control to TRUE for the first block and to FALSE for continuous filtering of all remaining blocks.
X contains the input signals for resampling. Each row of X contains an input signal. All the signals have the same length. The sampling interval of each signal in X is 1.
delay specifies the time of the first sample in Y.
dt specifies the sampling interval for Y.
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
FIR filter specifications specifies the minimum values this VI needs to specify the FIR filter.
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. If alias rejection (dB) is less than 48, this VI will use 48 instead. The default is 120.
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
Y returns the resampled signals. Each row of Y contains a resampled signal that corresponds to the input signal in the same row of X. All the signals have the same length.
t0 returns the time instant of the first sample in Y.
error out contains error information. This output provides standard error out functionality.
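Because each row of X is an independent channel of equal length, the multi-channel instance behaves like the single-channel instance applied row by row. A sketch of the same idea with SciPy, which filters along a chosen axis in one call (again an analogue, not the VI's implementation):

```python
import numpy as np
from scipy.signal import resample_poly

# Two channels as rows, both length 100, sampling interval 1.
t = np.arange(100)
X = np.vstack([np.sin(2 * np.pi * 0.02 * t),
               np.cos(2 * np.pi * 0.03 * t)])

# Resample every row at once; axis=1 filters along each row,
# so all channels share one filter design and one output length.
Y = resample_poly(X, 2, 1, axis=1)  # dt = 0.5 doubles the rate
```

Every row of Y corresponds to the same row of X, and all output rows have the same length, matching the description of Y above.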

Resample (constant to constant, complex single-channel)

anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, the computation requirements increase during resampling.
reset? controls the initialization of the internal states. The default is FALSE. The first time this VI runs or when reset? is TRUE, LabVIEW initializes the internal states to zero. When reset? is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this instance of this VI. To process a large data sequence consisting of smaller blocks, set this control to TRUE for the first block and to FALSE for continuous filtering of all remaining blocks.
X contains the complex input signal for resampling. The sampling interval of X is 1.
delay specifies the time of the first sample in Y.
dt specifies the sampling interval for Y.
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
FIR filter specifications specifies the minimum values this VI needs to specify the FIR filter.
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. If alias rejection (dB) is less than 48, this VI will use 48 instead. The default is 120.
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
Y returns the complex resampled signal.
t0 returns the time instant of the first sample in Y.
error out contains error information. This output provides standard error out functionality.
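Complex resampling works the same way as the real case: the real-coefficient anti-aliasing FIR filter is applied to the real and imaginary parts alike. A minimal sketch, assuming SciPy's polyphase resampler as the stand-in implementation:

```python
import numpy as np
from scipy.signal import resample_poly

# Complex exponential at normalized frequency 0.05 (sampling interval 1).
n = np.arange(100)
x = np.exp(2j * np.pi * 0.05 * n)

# The FIR filtering applies to both I and Q components;
# the output stays complex.
y = resample_poly(x, 2, 1)  # dt = 0.5
```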

Resample (constant to constant, complex multi-channel)

anti-aliasing? specifies whether the input signal undergoes lowpass filtering when LabVIEW downsamples the signal. If anti-aliasing? is TRUE (default), this VI protects the resampled signal from aliasing. However, the computation requirements increase during resampling.
reset? controls the initialization of the internal states. The default is FALSE. The first time this VI runs or when reset? is TRUE, LabVIEW initializes the internal states to zero. When reset? is FALSE, LabVIEW initializes the internal states to the final states from the previous call to this instance of this VI. To process a large data sequence consisting of smaller blocks, set this control to TRUE for the first block and to FALSE for continuous filtering of all remaining blocks.
X contains the complex input signals for resampling. Each row of X contains an input signal. All the signals have the same length. The sampling interval of each signal in X is 1.
delay specifies the time of the first sample in Y.
dt specifies the sampling interval for Y.
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
FIR filter specifications specifies the minimum values this VI needs to specify the FIR filter.
alias rejection (dB) specifies the minimum attenuation level of signal components aliased after any resampling operation. If alias rejection (dB) is less than 48, this VI will use 48 instead. The default is 120.
normalized bandwidth specifies the fraction of the new sampling rate that is not attenuated. The default is 0.4536.
Y returns the resampled complex signals. Each row of Y contains a resampled signal that corresponds to the input signal in the same row of X. All the signals have the same length.
t0 returns the time instant of the first sample in Y.
error out contains error information. This output provides standard error out functionality.
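The reset? behavior described for every instance above (initialize internal states to zero on the first block, then carry the final states into each subsequent call) is the standard stateful-filtering pattern. It can be sketched with an FIR filter and scipy.signal.lfilter; the filter here is illustrative, not the VI's actual anti-aliasing filter:

```python
import numpy as np
from scipy.signal import firwin, lfilter

taps = firwin(31, 0.4)  # example lowpass FIR filter
x = np.random.default_rng(0).standard_normal(1000)

# Process the sequence in blocks, carrying the filter state between
# calls, as the VI does when reset? is FALSE after the first block.
zi = np.zeros(len(taps) - 1)  # reset? = TRUE: states start at zero
blocks = []
for block in np.split(x, 4):
    y, zi = lfilter(taps, 1.0, block, zi=zi)
    blocks.append(y)
y_blockwise = np.concatenate(blocks)

# Filtering the whole sequence in one call gives identical results.
y_whole = lfilter(taps, 1.0, x)
```

Carrying the state forward makes block-wise processing exactly equivalent to filtering the full sequence at once, which is why the VI recommends reset? = TRUE only for the first block of a long sequence.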

Resample (constant to constant) Details

Related Information

Passband Ripple and Stopband Attenuation

Example

Refer to the Constant-to-Constant Resampling VI in the labview\examples\Signal Processing\Waveform Conditioning directory for an example of using the Resample (constant to constant) VI.
