Friday, April 10, 2026

NASA’s Artemis II Crew Beams Official Moon Flyby Photos to Earth

The Moon, backlit by the Sun during a solar eclipse, is photographed by NASA’s Orion spacecraft on Monday, April 6, 2026, during the Artemis II mission. Orion is visible in the foreground on the left. Earth is reflecting sunlight at the left edge of the Moon, which is slightly brighter than the rest of the disk. The bright spot visible just below the Moon’s bottom right edge is Saturn. Beyond that, the bright spot at the right edge of the image is Mars.

Credit: NASA

Editor’s note: Some photo captions were updated on April 8, 2026, to reflect ongoing scientific observations and discussion about the images.

The first flyby images of the Moon captured by NASA’s Artemis II astronauts during their historic test flight reveal regions no human has seen before, as well as a rare view of a solar eclipse from space. Released Tuesday, the images were captured April 6 during the mission’s seven-hour flyby of the lunar far side, documenting humanity’s return to the Moon’s vicinity and opening a trove of scientific data.

NASA astronauts Reid Wiseman, Victor Glover, and Christina Koch, and CSA (Canadian Space Agency) astronaut Jeremy Hansen have used a fleet of cameras to take thousands of photos. The agency released several images, with more expected in the coming days as the crew members, now more than halfway through their journey, head home toward Earth.

“Our four Artemis II astronauts — Reid, Victor, Christina, and Jeremy — took humanity on an incredible journey around the Moon and brought back images so exquisite and brimming with science, they will inspire generations to come,” said Dr. Nicky Fox, associate administrator, Science Mission Directorate, NASA Headquarters in Washington.

During the lunar flyby, the crew documented impact craters, ancient lava flows, and surface fractures that will help scientists study the Moon’s geologic evolution. They monitored color, brightness, and texture differences across the terrain, observed an earthset and earthrise, and captured solar‑eclipse views of the Sun’s corona. The crew also reported six meteoroid impact flashes on the darkened lunar surface.

Scientists already are analyzing the downlinked images, audio, and data to refine the timing and locations of these events and compare them with observations from amateur astronomers. The new imagery also will help NASA better understand the Moon’s geology and inform future exploration and science missions that will lay the foundation for an enduring presence on the Moon ahead of future astronaut missions to Mars.

“It was remarkable listening to the crew describe the stunning views during the flyby,” said Jacob Bleacher, NASA’s chief exploration scientist at the agency’s headquarters. “At first, their descriptions didn’t quite match what we were seeing on our screens. Now that higher resolution images are coming down, we can finally experience the moments they were trying to share and truly appreciate the scientific return provided by these images and our other research on this mission.” 

Official NASA imagery for viewing and download is available on the agency’s website and digital platforms.

Media should follow NASA’s media usage guidelines for all publication and distribution of these images.

NASA is targeting 8:07 p.m. EDT (5:07 p.m. PDT) Friday, April 10, for the splashdown of Artemis II off the coast of San Diego. NASA+ live return coverage begins at 6:30 p.m. EDT and will continue until NASA and Department of War personnel safely assist the crew out of Orion and transport them to the USS John P. Murtha.

Briefings, events, and 24/7 mission coverage are streaming on NASA’s YouTube channel; each event will have its own stream available closer to its start time. Learn how to watch NASA content through a variety of online platforms, including social media.

As part of the Golden Age of innovation and exploration, NASA will send Artemis astronauts on increasingly difficult missions to explore more of the Moon for scientific discovery and economic benefits, and to build the foundation for the first crewed missions to Mars.

To learn more about the Artemis program, visit: https://www.nasa.gov/artemis 

Cheryl Warner / Katherine Rohloff
Headquarters, Washington

Source: NASA’s Artemis II Crew Beams Official Moon Flyby Photos to Earth - NASA

Deep-tech company develops high-precision passive eye-tracking technology for smart contact lenses

Moiré-pattern eye-tracking label. It has four side-by-side sections made from two stacked layers of nano-stripe patterns. Credit: XPANCEO

XPANCEO, a deep-tech company developing smart contact lenses, has unveiled a passive eye-tracking system that achieves industry-level measurement precision using standard cameras. The system employs microscopic patterns embedded in contact lenses that enable high-accuracy passive gaze tracking without requiring active electronics or dedicated power sources.

This technology allows contact lenses to function as optical markers that can be read by existing cameras in laptops, vehicle dashboards, mobile devices, and helmet-mounted systems. The system uses two ultra-thin optical gratings that create interference patterns that shift as the eye rotates.

As the eye rotates and the viewing angle changes, the gratings (separated by a microscopic gap) shift relative to each other, similar to how layers in a pop-up book change position when tilted. This causes the so-called moiré patterns to undergo a measurable transformation. The tracking module measures 2.5 × 2.5 millimeters and is encapsulated in a biocompatible silicone elastomer, compatible with conventional contact lens manufacturing processes.
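The geometry described above can be sketched numerically: two stacked gratings with slightly different pitches produce a moiré fringe whose period is far larger than either pitch, so a tiny parallax shift between the layers becomes a large, camera-visible fringe displacement. This is a minimal illustration of the principle only; all dimensions below (`p1_um`, `p2_um`, `gap_um`) are assumed values, not XPANCEO’s actual device specifications.

```python
import math

def moire_fringe_shift(theta_deg, gap_um=10.0, p1_um=2.0, p2_um=2.2):
    """Apparent moiré fringe displacement (micrometers) for an eye
    rotation of theta_deg. All dimensions are illustrative assumptions."""
    # Moiré period of two stacked line gratings with slightly different
    # pitches: a small pitch mismatch yields a fringe period far larger
    # than either individual pitch.
    period_moire = p1_um * p2_um / abs(p2_um - p1_um)
    # Parallax: with a microscopic gap between the layers, a change in
    # viewing angle shifts the top grating laterally relative to the bottom.
    layer_shift = gap_um * math.tan(math.radians(theta_deg))
    # The fringe moves by the layer shift times the geometric amplification,
    # which is what lets an ordinary camera resolve sub-degree rotations.
    return layer_shift * (period_moire / p1_um)
```

With these toy numbers, a 0.3-degree rotation shifts the layers by only about 0.05 µm, but the elevenfold moiré amplification moves the fringe by roughly 0.6 µm, illustrating why small rotations become measurable by standard cameras.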

Current eye-tracking technologies mostly rely on external systems and work by shining infrared light onto the eye and using cameras to capture the reflection patterns from the cornea and sometimes the crystalline lens.

a) Contact lens with an integrated passive eye-tracking label, observed using an external camera module to measure lens orientation. b) Cross-section of the contact lens with the eye-tracking label. Credit: Advanced Functional Materials (2026). DOI: 10.1002/adfm.202522757

Computer vision algorithms then analyze these images by calculating corresponding gaze direction and processing the relative positions of multiple glints and the shape and position of the pupil. This continuous cycle of illumination, imaging, and analysis happens dozens of times per second.
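As a rough sketch of the conventional pupil-center/corneal-reflection approach described above: the pixel offset between the detected pupil center and a corneal glint is mapped to a gaze direction. The single-glint linear mapping and the `gain_deg_per_px` constant here stand in for the per-user calibration a real tracker performs; the values are placeholders, not taken from any actual system.

```python
def gaze_from_pupil_glint(pupil_px, glint_px, gain_deg_per_px=0.25):
    """Toy pupil-center/corneal-reflection gaze estimate.

    Maps the pixel offset between the detected pupil center and the corneal
    glint to a gaze direction in degrees. gain_deg_per_px is a placeholder
    for the per-user calibration a real tracker would compute.
    """
    dx = pupil_px[0] - glint_px[0]
    dy = pupil_px[1] - glint_px[1]
    return (dx * gain_deg_per_px, dy * gain_deg_per_px)

# A pupil center 10 px to the right of the glint maps to a 2.5-degree
# horizontal gaze offset under this toy calibration.
print(gaze_from_pupil_glint((110.0, 100.0), (100.0, 100.0)))  # (2.5, 0.0)
```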

These systems drain batteries relatively quickly and perform worse in challenging lighting conditions, including brightly lit environments where infrared signals compete with ambient light.

How the passive pattern system works

The new pattern-based technology offers two key advantages. First, the simplified setup eliminates the need for infrared illumination and works reliably in well-lit environments, reducing hardware complexity and power consumption.

Second, it enables universal deployment. Since cameras are already embedded in everyday devices and environments, the passive tracking system functions across multiple contexts without requiring dedicated infrastructure.

The research has been published in Advanced Functional Materials.

"This moiré pattern approach provides accurate eye orientation measurement using optical geometry without adding complexity or energy requirements to the lens," said Dr. Valentyn Volkov, Founder and CTO of XPANCEO.

"The technology extends the potential applications of contact lens platforms, particularly in environments where users are already interfacing with camera-equipped devices."

Medical and high-risk environment uses

The system’s 0.3-degree precision, achieved without restrictive clinical hardware, makes it a promising solution for detecting subtle eye movements in clinical applications, including the study of gaze patterns associated with neurological conditions. Such high-fidelity eye tracking is increasingly recognized as a vital biomarker for the early diagnosis of neurodegenerative conditions, including Parkinson’s and Alzheimer’s diseases, with recent research establishing specific eye-movement protocols for diagnosis.

Furthermore, the system's robustness makes it highly adaptable to extreme and high-stakes environments. In automotive, aerospace, or industrial settings, where users often wear helmets with embedded cameras, the continuous analysis of saccadic velocity and micro-fixations goes far beyond standard fatigue monitoring.

It enables the real-time detection of severe central nervous system fatigue, cognitive impairment, or intoxication, ensuring that operators are fully capable of performing their duties.

This technology expands the applications of smart contact lenses without increasing the system complexity. 

Provided by XPANCEO

Source: Deep-tech company develops high-precision passive eye-tracking technology for smart contact lenses