Hyperspectral imaging captures not just the red, green, and blue of a regular camera, but hundreds of very narrow wavelength bands across parts of the spectrum (typically visible through near‑infrared and sometimes beyond). For every pixel, it records a detailed spectrum, often called a “spectral fingerprint,” that can be used to identify materials or subtle differences (e.g., types of paint, minerals, vegetation, or man‑made objects). Regular imaging (RGB or multispectral) records only a few broad bands, so it can show how things look but usually cannot reliably distinguish materials that appear similar to the human eye.
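To make the “spectral fingerprint” idea concrete, here is a minimal, illustrative Python sketch (not from the NRL articles) that treats a hyperspectral image as a rows × cols × bands NumPy cube and uses the spectral angle between each pixel and a reference spectrum to flag likely material matches. The cube shape, the random placeholder data, and the match threshold are assumptions for demonstration only; spectral-angle matching is a standard remote-sensing technique and is not claimed here to be CHROMA's method.

```python
import numpy as np

def spectral_angle(pixel_spectrum: np.ndarray, reference_spectrum: np.ndarray) -> float:
    """Spectral angle (radians) between a pixel's spectrum and a reference
    'fingerprint'; smaller angles mean more similar materials."""
    cos_theta = np.dot(pixel_spectrum, reference_spectrum) / (
        np.linalg.norm(pixel_spectrum) * np.linalg.norm(reference_spectrum)
    )
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical hyperspectral cube: 512 x 512 pixels x 200 narrow bands
# (an RGB image would have only 3 broad bands per pixel).
rng = np.random.default_rng(0)
cube = rng.random((512, 512, 200))   # placeholder data, not real imagery
reference = rng.random(200)          # placeholder "library" spectrum of a target material

# Angle for a single pixel:
angle_00 = spectral_angle(cube[0, 0], reference)

# Vectorized version: angle map over the whole scene, then threshold to flag matches.
flat = cube.reshape(-1, cube.shape[-1])
angles = np.arccos(np.clip(
    flat @ reference / (np.linalg.norm(flat, axis=1) * np.linalg.norm(reference)),
    -1.0, 1.0)).reshape(cube.shape[:2])
matches = angles < 0.1   # radians; threshold chosen only for illustration
```

With only three broad RGB bands, this kind of per-pixel spectral comparison has far less information to work with, which is why visually similar materials that a color camera cannot separate can often still be distinguished in hyperspectral data.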
In this context, a “domain-centric path” means that the AI and sensing system are designed around the real-world domain (coastal and aquatic environments, naval targets, and the physics of light and materials), not just around generic AI models. The CHROMA experiment embeds expert knowledge about the environment, target materials, and sensors into how data are collected and how AI is trained, so the models are more trustworthy and effective for specific naval and environmental missions.
The CHROMA experiment uses multiple sensor types and platforms in a coastal/aquatic-like environment:
The experiment is led at NRL by Katarina Doctor, Ph.D., who is identified as the CHROMA Project Lead. Other NRL leaders quoted in the article include Gautam Trivedi, Ph.D. (Information Operations Branch Head) and Joey Mathews (Information Technology Division Superintendent), but Doctor is the named project lead for CHROMA.
The Navy Department and the broader scientific community are expected to use CHROMA’s results mainly through:
Yes, at least the data will be openly shared: the article explicitly states that ROCX (of which CHROMA is a part) will produce comprehensive hyperspectral datasets that "will be shared openly with the remote sensing community" via an open-access repository. The Navy/NRL pieces do not explicitly say that the trained AI models themselves will be released, only that the data will support AI development. So public data access is confirmed, but public release of the full AI models is not stated.
According to NRL, the CHROMA experiment “ran Sept. 4–19, 2025” as part of the ROCX 2025 campaign at RIT’s Tait Preserve in New York. ROCX planning documents indicate a primary experiment window of September 8–19, 2025, with intensive ground and flight data collection completed by October 1, 2025, and a goal to release the compiled open-access datasets by mid‑2026. So the field experiment itself lasted about two weeks in early–mid September 2025, and the data processing and release phase is expected to continue into mid‑2026.
The articles describe the AI work in functional rather than algorithmic terms, but they indicate that CHROMA is testing: