
AeroFocus

AeroFocus is a PyQt6 desktop application for analyzing LWIR aerial image quality, with a focus on blur detection, motion correlation, and frame classification (keep/discard/mask). It was built to help understand which frames from a flight are usable for downstream super-resolution and to explore the relationship between aircraft motion and image blur.

Repository: /home/geoff/projects/ceres/superrez/aerofocus/
Status: Last active April 10, 2025. Development spanned four days (April 7-10, 2025) and reached a functional prototype with Laplacian and FFT blur analysis, HSV visualization, motion data graphs, and 51 passing tests. Not actively maintained.

Purpose and MFSR Pipeline Role

AeroFocus addresses a fundamental question for the MFSR pipeline: which frames from a flight are sharp enough to be useful for super-resolution?

In multi-frame super-resolution, the quality of the output depends heavily on the quality of the input frames. Blurry frames (caused by aircraft motion, turbulence, or focus issues) degrade the SR result. AeroFocus was designed to:

  1. Detect and quantify blur in individual LWIR frames using spatial (Laplacian) and frequency (FFT) domain methods
  2. Correlate blur with aircraft motion by parsing INS/GPS log data and plotting angular velocities and accelerations alongside blur scores
  3. Enable frame classification -- marking frames as keep, discard, or partially usable (mask) based on blur analysis
  4. Identify motion thresholds that predict blur, which could feed into automated frame rejection criteria

This tool sits upstream of LWIR-Align and PIUnet in the pipeline. If a frame is too blurry, it should be rejected before alignment and SR processing.

Architecture

The application follows a modular MVC-like pattern with clear separation between data, processing, and UI:

Core Components

  • BlurDetector (blur_detector.py) -- Laplacian variance blur detection, blur scoring, HSV visualization
  • BlurManager (blur_manager.py) -- orchestrates both Laplacian and FFT blur analysis, manages state
  • AnalysisController (analysis_controller.py) -- central controller connecting the UI to the backend via Qt signals/properties
  • ImageProcessor (image_processor.py) -- stateless image enhancement: contrast stretch, CLAHE, HSV colorization
  • ImageManager (image_manager.py) -- image loading, navigation, signal emission
  • LogParser (log_parser.py) -- parses CSV motion/GPS logs, calculates deltas and derived quantities

Widget Layer (widgets/)

  • image_display_panel.py / image_display_widget.py -- dual-panel image display
  • blur_analysis_panel.py / laplacian_blur_panel.py / fft_analysis_panel.py -- blur controls
  • motion_data_panel.py / graph_widget.py -- motion data visualization
  • histogram_panel.py -- intensity histogram
  • playback_controls.py -- image sequence playback
  • image_controller.py -- UI-level image state management

Data Flow

  1. ImageManager loads 16-bit LWIR images, applies CLAHE enhancement to produce 8-bit display images
  2. AnalysisController receives image-loaded signals, passes the enhanced image to BlurManager
  3. BlurManager runs Laplacian or FFT analysis, stores results
  4. AnalysisController pulls visualization and scores from BlurManager, emits signals to update the UI
  5. LogParser independently parses motion CSV logs and provides time-windowed data to MotionDataPanel

Blur Detection Methods

Laplacian Variance (Primary)

The main blur detection method (blur_detector.py) uses the variance of the Laplacian operator:

  1. Normalize image to [0, 1] float
  2. Apply Laplacian filter (skimage.filters.laplace)
  3. Compute local variance using a uniform filter of configurable window size (default 8 pixels)
  4. Normalize the variance map to [0, 1] where high variance = sharp
  5. Invert so high values = blurry
  6. Apply threshold to zero out regions below the blur threshold

Parameters:

  • window_size (default 8) -- local variance window size; must be even
  • blur_threshold (default 0.1) -- minimum blur level to report
  • variance_threshold (default 1e-6) -- below this, the entire image is considered blurry (uniform)

The overall blur score is computed as the mean of the thresholded blur map, returning a float in [0, 1] where 1.0 = completely blurry.
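The steps above can be sketched in a few lines of NumPy/SciPy. This is an illustrative reconstruction, not the project's code: it uses scipy.ndimage.laplace as a stand-in for skimage.filters.laplace, and the function name and exact normalization are assumptions.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def laplacian_blur_score(img, window_size=8, blur_threshold=0.1,
                         variance_threshold=1e-6):
    """Return (blur_map, score); score is in [0, 1], 1.0 = completely blurry."""
    x = img.astype(np.float64)
    x = (x - x.min()) / (x.max() - x.min() + 1e-12)     # 1. normalize to [0, 1]
    lap = laplace(x)                                     # 2. Laplacian response
    # 3. local variance over a window: E[x^2] - E[x]^2
    mean = uniform_filter(lap, size=window_size)
    var = np.clip(uniform_filter(lap**2, size=window_size) - mean**2, 0, None)
    if var.max() < variance_threshold:                   # uniform image: all blurry
        return np.ones_like(x), 1.0
    sharp = var / var.max()                              # 4. high variance = sharp
    blur_map = 1.0 - sharp                               # 5. invert: high = blurry
    blur_map[blur_map < blur_threshold] = 0.0            # 6. threshold
    return blur_map, float(blur_map.mean())              # score = mean of map

sharp_img = np.random.rand(64, 64)       # noise: lots of high-frequency detail
flat_img = np.full((64, 64), 0.5)        # uniform: maximally blurry by this metric
_, s1 = laplacian_blur_score(sharp_img)
_, s2 = laplacian_blur_score(flat_img)
```

The variance_threshold branch handles the degenerate case: a featureless image has no Laplacian response anywhere, so it is reported as fully blurry rather than dividing by a near-zero maximum.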

FFT High-Pass Filtering (Secondary)

The BlurManager also implements an FFT-based analysis:

  1. Compute 2D FFT of the normalized image
  2. Shift zero-frequency to center
  3. Apply a circular high-pass filter (zeroing frequencies within a configurable radius)
  4. Inverse FFT to get the spatial-domain high-frequency content
  5. Visualize both the magnitude spectrum (optionally log-scaled) and the filtered spatial result

Blurry images will have weaker high-frequency content, visible as a dimmer filtered result.
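A minimal NumPy sketch of those five steps (the function name and the default radius are illustrative, not taken from BlurManager):

```python
import numpy as np

def fft_highpass(img, radius=10):
    """Return (log magnitude spectrum, high-pass filtered spatial image)."""
    f = np.fft.fftshift(np.fft.fft2(img))        # 1-2. FFT, zero-frequency to center
    spectrum = np.log1p(np.abs(f))               # 5a. log-scaled magnitude for display
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    # 3. circular high-pass: zero all frequencies within `radius` of the center
    mask = (yy - h // 2) ** 2 + (xx - w // 2) ** 2 > radius ** 2
    filtered = np.fft.ifft2(np.fft.ifftshift(f * mask))   # 4. back to spatial domain
    return spectrum, np.abs(filtered)            # 5b. high-frequency content

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))                     # broadband: strong high frequencies
blurry = np.full((64, 64), sharp.mean())         # flat: essentially no high frequencies
_, hf_sharp = fft_highpass(sharp)
_, hf_blur = fft_highpass(blurry)
```

Comparing the mean magnitude of the two filtered results makes the "dimmer result for blurrier input" behavior directly measurable.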

Visualization

AeroFocus provides several visualization modes:

  • Colorized blur map -- HSV color mapping where red = blurry, blue = sharp (or inverted), with configurable saturation
  • Raw blur map -- grayscale where bright = blurry
  • FFT spectrum -- magnitude spectrum of the image's Fourier transform
  • FFT filtered result -- spatial-domain result after high-pass filtering
  • Intensity histogram -- matplotlib-generated histogram of pixel values
  • Motion data graphs -- delta roll/pitch/yaw, delta velocities, total angular velocity, total velocity change
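The colorized blur map can be sketched as a hue ramp from blue (sharp) to red (blurry). This assumes matplotlib for the HSV-to-RGB conversion; the exact hue mapping and parameter names in AeroFocus may differ.

```python
import numpy as np
from matplotlib.colors import hsv_to_rgb

def colorize_blur_map(blur_map, saturation=1.0, invert=False):
    """Map blur values in [0, 1] to HSV hues: red = blurry, blue = sharp."""
    b = 1.0 - blur_map if invert else blur_map
    hue = (1.0 - b) * (2.0 / 3.0)    # blur 1.0 -> hue 0 (red), blur 0.0 -> 2/3 (blue)
    hsv = np.stack([hue, np.full_like(hue, saturation), np.ones_like(hue)], axis=-1)
    return hsv_to_rgb(hsv)           # float RGB image in [0, 1]

blur_map = np.linspace(0.0, 1.0, 16).reshape(4, 4)
rgb = colorize_blur_map(blur_map)
```

The `invert` flag mirrors the "(or inverted)" option above, and `saturation` is the configurable saturation mentioned in the list.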

Motion Data Integration

The LogParser parses CSV log files containing per-frame INS/GPS data:

  • Raw fields: roll, pitch, yaw, velocity (u/v/w), latitude, longitude, altitude, PDOP, camera focus steps, timestamps
  • Computed fields: delta roll/pitch/yaw, delta velocities, total angular velocity (magnitude of angular deltas), total velocity change (magnitude of velocity deltas)
  • Time windowing: can extract data for a configurable window around any timestamp
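Computing the delta and magnitude fields is mostly simple differencing, with one subtlety: angular deltas must respect wraparound (the circular-quantity problem discussed in the DAQ slide deck section). A hedged sketch — function names are illustrative, and whether LogParser wraps its deltas exactly this way is an assumption:

```python
import numpy as np

def angular_delta(angles_deg):
    """Frame-to-frame angle change, wrapped to [-180, 180) so a yaw step
    from 359.5 deg to 1.0 deg reads as +1.5 deg, not -358.5 deg."""
    d = np.diff(angles_deg)
    return (d + 180.0) % 360.0 - 180.0

def total_angular_velocity(d_roll, d_pitch, d_yaw):
    """Magnitude of the per-frame angular deltas."""
    return np.sqrt(d_roll**2 + d_pitch**2 + d_yaw**2)

yaw = np.array([358.0, 359.5, 1.0, 2.5])   # crosses the 360/0 boundary
d_yaw = angular_delta(yaw)
```

Naive `np.diff` on the sample above would report a spurious -358.5 degree spike at the boundary crossing, which would corrupt any motion-vs-blur threshold fitted to the data.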

This enables correlating blur scores with aircraft motion to identify motion thresholds that predict blur -- a key goal for automated frame rejection.

DAQ Slide Deck

The daq-slide-deck/ subdirectory contains a separate project: an HTML presentation build system used for a CeresTalk titled "Stop Fighting Wraparound: Embed Your Rings!" (dated September 9, 2025). This presentation is about handling circular quantities (angles, time-of-day, hue) correctly in software -- a problem encountered during AeroFocus development when working with yaw angles and similar wrapped quantities.

The presentation is not about AeroFocus itself, but the circular-quantity math problem was motivated by the motion data analysis work in AeroFocus. The build system (presentation_builder.py, build.py) generates interactive single-file HTML presentations with embedded assets.

Development History

AeroFocus was developed over an intense 4-day sprint from April 7-10, 2025:

  • 2025-04-07 -- project created; core data handling (image loading, log parsing), initial GUI, navigation
  • 2025-04-08 -- major refactoring ("refactor from hell"), ImageController state management, CLAHE fixes
  • 2025-04-09 -- ImageManager replaced the deprecated ImageHandler; histogram widget; image display refactoring
  • 2025-04-10 -- blur analysis panel, Laplacian blur detection integration, FFT analysis, final tests

Total: 30 commits, 51 passing tests at final commit.

Design Decisions and Constraints

  • No try/except blocks -- all error handling through return values and validation. This was a strict project rule.
  • 300-line file limit -- enforced modular decomposition
  • PyQt6 signal/slot architecture -- UI updates driven by signal emission, not direct method calls
  • Stateless ImageProcessor -- all methods are @staticmethod, no internal state
  • CLAHE on 8-bit for analysis -- blur detection runs on the CLAHE-enhanced 8-bit image, not the raw 16-bit data
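Two of these rules -- the stateless ImageProcessor and error handling via return values instead of try/except -- can be illustrated together. This is a sketch of the pattern, not the real ImageProcessor API; the method name and validation behavior are assumptions.

```python
import numpy as np

class ImageProcessor:
    """Stateless by design: every method is a @staticmethod, no instance state."""

    @staticmethod
    def contrast_stretch(img, low_pct=2.0, high_pct=98.0):
        """Percentile contrast stretch to [0, 1].
        Returns (result, ok) rather than raising -- per the no-try/except rule,
        errors are reported through return values after explicit validation."""
        if img.size == 0 or low_pct >= high_pct:
            return None, False                    # validation failure, no exception
        lo, hi = np.percentile(img, [low_pct, high_pct])
        if hi == lo:
            return np.zeros_like(img, dtype=np.float64), True
        out = np.clip((img.astype(np.float64) - lo) / (hi - lo), 0.0, 1.0)
        return out, True

img = np.arange(100, dtype=np.float64).reshape(10, 10)
out, ok = ImageProcessor.contrast_stretch(img)
bad, ok_bad = ImageProcessor.contrast_stretch(np.array([]))
```

Callers branch on the `ok` flag, which keeps failure paths explicit in the signal/slot handlers rather than hidden behind exception propagation.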

Current State and Gaps

Completed (Phase 1-2)

  • Image loading, enhancement, and navigation
  • Laplacian blur detection with configurable parameters
  • FFT blur analysis
  • Blur visualization (colorized and raw maps)
  • Motion data parsing and visualization
  • Histogram display
  • HSV colorization
  • 51 tests passing

Not Implemented (Phase 3)

  • Frame classification tools (keep/discard/mask) -- the intended primary output
  • Correlation analysis between motion parameters and blur metrics
  • Batch analysis and export
  • Automated frame rejection criteria
  • Integration with downstream MFSR pipeline tools

Connection to MFSR Pipeline

AeroFocus was an exploratory tool to understand frame quality in the flight data. The intended workflow was:

  1. AeroFocus analyzes all frames from a flight, computing blur scores
  2. Motion thresholds are identified that predict blur
  3. These thresholds become automated rejection criteria
  4. Only frames passing quality checks proceed to LWIR-Align for registration and then to PIUnet or other SR networks

This automated frame rejection pipeline was never completed. The project was paused after the prototype phase, likely because the team shifted focus to other pipeline components. The blur detection core (Laplacian variance) is a standard and reasonable approach for LWIR imagery, though threshold tuning and validation against actual SR performance were never done.