enow.com Web Search

Search results

  2. Point spread function - Wikipedia

    en.wikipedia.org/wiki/Point_spread_function

    By virtue of the linearity property of optical non-coherent imaging systems, i.e., Image(Object 1 + Object 2) = Image(Object 1) + Image(Object 2), the image of an object in a microscope or telescope as a non-coherent imaging system can be computed by expressing the object-plane field as a weighted sum of 2D impulse functions, and then expressing the image-plane field as a weighted sum of the ...
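The superposition property stated in this snippet can be sketched numerically: for a non-coherent system, imaging amounts to convolving the object intensity with the point spread function, so the image of a sum of objects equals the sum of their images. The Gaussian PSF below is an assumption chosen purely for illustration.

```python
import numpy as np

def incoherent_image(obj_intensity, psf):
    """Image formation as convolution of object intensity with the PSF
    (circular convolution via FFT; adequate for this illustration)."""
    return np.real(np.fft.ifft2(np.fft.fft2(obj_intensity) * np.fft.fft2(psf)))

rng = np.random.default_rng(0)
n = 32
# a hypothetical Gaussian PSF (assumed shape, not from the source)
y, x = np.mgrid[:n, :n]
psf = np.exp(-(((x - n / 2) ** 2 + (y - n / 2) ** 2) / (2 * 2.0 ** 2)))
psf /= psf.sum()

obj1 = rng.random((n, n))
obj2 = rng.random((n, n))

lhs = incoherent_image(obj1 + obj2, psf)
rhs = incoherent_image(obj1, psf) + incoherent_image(obj2, psf)
print(np.allclose(lhs, rhs))  # → True: Image(O1 + O2) == Image(O1) + Image(O2)
```

Linearity here is exact up to floating-point error, since convolution (and the FFT implementing it) is a linear operation.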

  3. Optical transfer function - Wikipedia

    en.wikipedia.org/wiki/Optical_transfer_function

    Optical systems, and in particular optical aberrations, are not always rotationally symmetric. Periodic patterns that have a different orientation can thus be imaged with different contrast even if their periodicity is the same. Optical transfer functions, or modulation transfer functions, are thus generally two-dimensional functions.
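The orientation dependence described above can be demonstrated with a small sketch: for an assumed astigmatic (elliptical Gaussian) PSF, the 2D MTF, the magnitude of the PSF's Fourier transform, falls off faster along the axis where the blur is wider, so the same spatial frequency is transferred with different contrast in the two orientations.

```python
import numpy as np

n = 64
y, x = np.mgrid[:n, :n] - n // 2
# hypothetical astigmatic PSF: wider in x than in y (assumed parameters)
psf = np.exp(-(x ** 2 / (2 * 4.0 ** 2) + y ** 2 / (2 * 1.5 ** 2)))
psf /= psf.sum()

# the 2D MTF is the magnitude of the PSF's 2D Fourier transform
mtf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf))))
c = n // 2          # DC component sits at the center after fftshift
k = 8               # one spatial frequency, sampled along two orientations
print(mtf[c, c + k] < mtf[c + k, c])  # → True: more blur in x => less contrast along kx
```

With the stated sigmas the contrast at this frequency differs by more than an order of magnitude between the two orientations, even though the periodicity is identical.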

  4. Optical flow - Wikipedia

    en.wikipedia.org/wiki/Optical_flow

    In some cases the processing circuitry may be implemented using analog or mixed-signal circuits to enable fast optical flow computation using minimal current consumption. One area of contemporary research is the use of neuromorphic engineering techniques to implement circuits that respond to optical flow, and thus may be appropriate for use in ...

  5. Aperture synthesis - Wikipedia

    en.wikipedia.org/wiki/Aperture_synthesis

    Aperture synthesis is possible only if both the amplitude and the phase of the incoming signal are measured by each telescope. For radio frequencies this is possible electronically, while for optical frequencies the electromagnetic field cannot be measured directly and correlated in software, but must be propagated by sensitive optics and interfered optically.
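The software correlation available at radio frequencies can be illustrated with a toy example (all signal parameters below are invented for illustration): correlating the complex "voltages" recorded at two antennas yields a complex visibility that preserves both the amplitude ratio and the phase delay between them.

```python
import numpy as np

fs, f0 = 1_000.0, 50.0
t = np.arange(0.0, 1.0, 1.0 / fs)
phase_delay = 0.7  # assumed geometric delay between the two antennas, in radians

# analytic (complex) voltages as a radio receiver can record them electronically
v1 = np.exp(1j * 2 * np.pi * f0 * t)
v2 = 0.8 * np.exp(1j * (2 * np.pi * f0 * t - phase_delay))

# software correlation: the complex visibility carries amplitude AND phase
visibility = np.mean(v1 * np.conj(v2))
print(round(abs(visibility), 3), round(np.angle(visibility), 3))
```

The recovered amplitude (0.8) and phase (0.7 rad) match the values built into the signals; an optical interferometer must instead obtain this product by physically interfering the beams.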

  6. Computational imaging - Wikipedia

    en.wikipedia.org/wiki/Computational_imaging

    Computational imaging systems span a broad range of applications. While applications such as SAR, computed tomography, and seismic inversion are well known, they have undergone significant improvements (faster, higher-resolution, lower-dose exposures [3]) driven by advances in signal and image processing algorithms (including compressed sensing techniques), and faster computing platforms.

  7. Pupil function - Wikipedia

    en.wikipedia.org/wiki/Pupil_function

    The pupil function or aperture function describes how a light wave is affected upon transmission through an optical imaging system such as a camera, microscope, or the human eye. More specifically, it is a complex function of the position in the pupil [1] or aperture (often an iris) that indicates the relative change in amplitude and phase ...
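A minimal numerical sketch of such a complex pupil function, assuming a circular aperture and a made-up defocus phase term: the far-field amplitude is the Fourier transform of the pupil, and the squared magnitude gives the point spread function, so any phase aberration in the pupil lowers the on-axis peak (Strehl ratio below 1).

```python
import numpy as np

def psf_from_pupil(pupil):
    # far-field amplitude is the Fourier transform of the pupil;
    # its squared magnitude, normalized, is the point spread function
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    p = np.abs(field) ** 2
    return p / p.sum()

n = 128
y, x = (np.mgrid[:n, :n] - n // 2) / (n // 4)  # pupil-plane coordinates, unit radius
r2 = x ** 2 + y ** 2
aperture = (r2 <= 1.0).astype(float)      # amplitude part: open circular pupil
pupil = aperture * np.exp(1j * 2.0 * r2)  # hypothetical defocus: 2 rad of phase at the rim

strehl = psf_from_pupil(pupil).max() / psf_from_pupil(aperture).max()
print(strehl < 1.0)  # → True: the phase aberration lowers the on-axis peak
```

The amplitude part of the pupil (the hard circular cutoff) sets the diffraction limit, while the phase part encodes aberrations; both live in the single complex-valued array above.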

  8. Aperture - Wikipedia

    en.wikipedia.org/wiki/Aperture

    The sampling aperture can be a literal optical aperture, that is, a small opening in space, or it can be a time-domain aperture for sampling a signal waveform. For example, film grain is quantified as graininess via a measurement of film density fluctuations as seen through a 0.048 mm sampling aperture.

  9. Optical computing - Wikipedia

    en.wikipedia.org/wiki/Optical_computing

    Optical computing or photonic computing uses light waves produced by lasers or incoherent sources for data processing, data storage, or data communication for computing. For decades, photons have shown promise to enable a higher bandwidth than the electrons used in conventional computers (see optical fibers).