Saturday, December 31, 2022

Watch the Latest Water Satellite Unfold Itself in Space - UNIVERSE


Two cameras aboard the Surface Water and Ocean Topography (SWOT) satellite captured the large mast and antenna panels of the spacecraft’s main science instrument deploying over four days, a process completed on Dec. 22, 2022. The masts, which unfold from opposite sides of the spacecraft, can be seen extending and locking into place, but the cameras stopped short of capturing the antennas at the ends of the masts being fully deployed (a milestone the team confirmed with telemetry data). This video places the two camera views side by side. Credits: NASA/JPL-Caltech/CNES

Cameras on the Surface Water and Ocean Topography spacecraft captured the antennas for its main science instrument unfurling in orbit.

But before SWOT can begin measuring the height of Earth’s water, the satellite needed to unfold its large mast and antenna panels (see above), a step that followed the successful deployment of the solar panel arrays that power the spacecraft. The mission monitors and controls the satellite using telemetry data, but it also equipped the spacecraft with four customized commercial cameras to record the action.

The solar arrays fully deployed shortly after launch, taking about 10 minutes.

The Surface Water and Ocean Topography (SWOT) satellite deployed its solar arrays while in Earth orbit.

The antennas successfully deployed over four days, a process completed on Dec. 22. The two cameras trained on the KaRIn antennas captured the mast extending out from the spacecraft and locking in place but stopped short of capturing the antennas being fully deployed (a milestone the team confirmed with telemetry data).

Thirty-three feet (10 meters) apart, at either end of the mast, the two antennas belong to the groundbreaking Ka-band Radar Interferometer (KaRIn) instrument. Designed to capture precise measurements of the height of water in Earth’s freshwater bodies and the ocean, KaRIn will see eddies, currents, and other ocean features less than 13 miles (20 kilometers) across. It will also collect data on lakes and reservoirs larger than 15 acres (62,500 square meters) and rivers wider than 330 feet (100 meters).

KaRIn will do this by bouncing radar pulses off the surface of water on Earth and receiving the signals with both of those antennas, collecting data along a swath that’s 30 miles (50 kilometers) wide on either side of the satellite.
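
To make the two-antenna geometry concrete, here is a minimal flat-Earth sketch of cross-track radar interferometry in Python. It illustrates the principle only, not SWOT's actual processing chain; the constants are approximate and the function name is our own.

import numpy as np

# Approximate KaRIn-like parameters (illustrative, not mission values).
WAVELENGTH = 0.0084    # Ka-band (~35.75 GHz) wavelength, meters
BASELINE = 10.0        # separation of the two antennas on the mast, meters
ALTITUDE = 890_000.0   # approximate SWOT orbit altitude, meters

def water_surface_height(phase, slant_range):
    """Estimate surface height from the interferometric phase difference.

    phase: unwrapped phase difference between the two antennas, radians
    slant_range: distance from the spacecraft to the surface point, meters
    """
    # The phase difference encodes the path-length difference between
    # the echoes arriving at the two antennas...
    delta_r = phase * WAVELENGTH / (2.0 * np.pi)
    # ...which fixes the look angle off nadir...
    theta = np.arcsin(delta_r / BASELINE)
    # ...and, combined with the measured range, the height of the
    # reflecting surface (flat-Earth sketch; no roll or phase wrapping).
    return ALTITUDE - slant_range * np.cos(theta)

Because the wavelength is millimeters while the baseline is 10 meters, small phase changes map to fine height resolution from orbit, which is the design reason the full-length mast matters.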

The data SWOT provides will help researchers and decision-makers address some of the most pressing climate questions of our time and help communities prepare for a warming world.

This illustration shows the SWOT spacecraft with its antenna mast and solar arrays fully deployed. Credits: NASA/JPL-Caltech

More About the Mission

SWOT was jointly developed by NASA and the French space agency Centre National d’Études Spatiales (CNES), with contributions from the Canadian Space Agency (CSA) and the UK Space Agency. JPL, which is managed for NASA by Caltech in Pasadena, California, leads the U.S. component of the project. For the flight system payload, NASA is providing the Ka-band Radar Interferometer (KaRIn) instrument, a GPS science receiver, a laser retroreflector, a two-beam microwave radiometer, and NASA instrument operations. CNES is providing the Doppler Orbitography and Radioposition Integrated by Satellite (DORIS) system, the dual-frequency Poseidon altimeter (developed by Thales Alenia Space), the KaRIn radio-frequency subsystem (together with Thales Alenia Space and with support from the UK Space Agency), the satellite platform, and ground operations. CSA is providing the KaRIn high-power transmitter assembly. NASA is providing the launch vehicle, and the agency’s Launch Services Program, based at Kennedy Space Center, is managing the associated launch services.

To learn more about SWOT, visit: https://swot.jpl.nasa.gov/

Source: Watch the Latest Water Satellite Unfold Itself in Space | NASA


Researchers use 3D bioprinting to create eye tissue


The outer blood-retina barrier is the interface of the retina and the choroid, including Bruch's membrane and the choriocapillaris. Credit: National Eye Institute

Scientists used patient stem cells and 3D bioprinting to produce eye tissue that will advance understanding of the mechanisms of blinding diseases. The research team from the National Eye Institute (NEI), part of the National Institutes of Health, printed a combination of cells that form the outer blood-retina barrier—eye tissue that supports the retina's light-sensing photoreceptors. The technique provides a theoretically unlimited supply of patient-derived tissue to study degenerative retinal diseases such as age-related macular degeneration (AMD).

"We know that AMD starts in the outer blood-retina barrier," said Kapil Bharti, Ph.D., who heads the NEI Section on Ocular and Stem Cell Translational Research. "However, mechanisms of AMD initiation and progression to advanced dry and wet stages remain poorly understood due to the lack of physiologically relevant human models." The outer blood-retina barrier consists of the retinal pigment epithelium (RPE), separated by Bruch's membrane from the blood-vessel rich choriocapillaris. Bruch's membrane regulates the exchange of nutrients and waste between the choriocapillaris and the RPE. In AMD, lipoprotein deposits called drusen form outside Bruch's membrane, impeding its function. Over time, the RPE break down leading to photoreceptor degeneration and vision loss.

Bharti and colleagues combined three immature choroidal cell types in a hydrogel: pericytes and endothelial cells, which are key components of capillaries; and fibroblasts, which give tissues structure. The scientists then printed the gel on a biodegradable scaffold. Within days, the cells began to mature into a dense capillary network.

NIH researchers used 3D bioprinting to create eye tissue: Technique provides model for studying genesis of age-related macular degeneration and other eye diseases. Credit: National Eye Institute

On day nine, the scientists seeded retinal pigment epithelial cells on the flip side of the scaffold. The printed tissue reached full maturity on day 42. Tissue analyses and genetic and functional testing showed that the printed tissue looked and behaved similarly to the native outer blood-retina barrier. Under induced stress, the printed tissue exhibited patterns of early AMD, such as drusen deposits underneath the RPE, and progression to the late dry stage of AMD, in which tissue degradation was observed. Low oxygen induced a wet AMD-like appearance, with hyperproliferation of choroidal vessels that migrated into the sub-RPE zone. Anti-VEGF drugs, used to treat AMD, suppressed this vessel overgrowth and migration and restored tissue morphology.

"By printing cells, we're facilitating the exchange of cellular cues that are necessary for normal outer blood-retina barrier anatomy," said Bharti. "For example, presence of RPE cells induces gene expression changes in fibroblasts that contribute to the formation of Bruch's membrane—something that was suggested many years ago but wasn't proven until our model." Among the technical challenges that Bharti's team addressed were generating a suitable biodegradable scaffold and achieving a consistent printing pattern through the development of a temperature-sensitive hydrogel that achieved distinct rows when cold but that dissolved when the gel warmed. Good row consistency enabled a more precise system of quantifying tissue structures. They also optimized the cell mixture ratio of pericytes, endothelial cells, and fibroblasts.

Co-author Marc Ferrer, Ph.D., director of the 3D Tissue Bioprinting Laboratory at NIH's National Center for Advancing Translational Sciences, and his team provided expertise for the biofabrication of the outer blood-retina barrier tissues "in-a-well," along with analytical measurements to enable drug screening.


The eye's outer blood-retina barrier comprises the retinal pigment epithelium, Bruch's membrane, and the choriocapillaris. Credit: National Eye Institute

"Our collaborative efforts have resulted in very relevant retina tissue models of degenerative eye diseases," Ferrer said. "Such tissue models have many potential uses in translational applications, including therapeutics development."

Bharti and collaborators are using printed blood-retina barrier models to study AMD, and they are experimenting with adding additional cell types to the printing process, such as immune cells, to better recapitulate native tissue.

by National Eye Institute

Source: Researchers use 3D bioprinting to create eye tissue (medicalxpress.com)

Free Online Yoga Class with Hudson Leick


Hilarious CCTV Fails Of 2022 | FailArmy


Short Film - You Missed a Spot (2020) - Comedy - Horror


John Krasinski Channels Alec Baldwin and Harrison Ford to Play Realistic Superhero Jack Ryan - IMDb


Behind the scenes footage - Kristen Stewart at Thunder Studios (COME SWIM)


Funny and Weird Clips (2850)

Friday, December 30, 2022

NASA Developing AI to Steer Using Landmarks – On the Moon - UNIVERSE

Much like how familiar landmarks can give travelers a sense of direction when their smartphones lose their lock on GPS signals, a NASA engineer is teaching a machine to use features on the Moon’s horizon to navigate across the lunar surface.

“For safety and science geotagging, it’s important for explorers to know exactly where they are as they explore the lunar landscape,” said Alvin Yew, a research engineer at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Equipping an onboard device with a local map would support any mission, whether robotic or human.”

The collection of ridges, craters, and boulders that form a lunar horizon can be used by an artificial intelligence to accurately locate a lunar traveler. A system being developed by Research Engineer Alvin Yew would provide a backup location service for future explorers, robotic or human. Credits: NASA/MoonTrek/Alvin Yew

NASA is currently working with industry and international partner agencies to develop LunaNet, a communications and navigation architecture that will bring “internet-like” capabilities to the Moon, including location services.

However, explorers in some regions on the lunar surface may require overlapping solutions derived from multiple sources to assure safety should communication signals not be available.

“It’s critical to have dependable backup systems when we’re talking about human exploration,” Yew said. “The motivation for me was to enable lunar crater exploration, where the entire horizon would be the crater rim.”

Yew started with data from NASA’s Lunar Reconnaissance Orbiter, specifically its Lunar Orbiter Laser Altimeter (LOLA). LOLA measures slopes and lunar surface roughness and generates high-resolution topographic maps of the Moon. Using LOLA’s digital elevation models, Yew is training an artificial intelligence to recreate features on the lunar horizon as they would appear to an explorer standing on the lunar surface. Those digital panoramas can be used to correlate known boulders and ridges with those visible in pictures taken by a rover or astronaut, providing accurate location identification for any given region.

“Conceptually, it’s like going outside and trying to figure out where you are by surveying the horizon and surrounding landmarks,” Yew said. “While a ballpark location estimate might be easy for a person, we want to demonstrate accuracy on the ground down to less than 30 feet (9 meters). This accuracy opens the door to a broad range of mission concepts for future exploration.”
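
As a rough sketch of that idea (our own simplification, not Yew's actual pipeline), one can render the horizon profile a digital elevation model predicts at each candidate position and keep the position whose prediction best matches the observed panorama. The function names, brute-force search, and mean-squared-error score below are all illustrative assumptions.

import numpy as np

def horizon_profile(dem, x0, y0, cell=5.0, n_az=360, max_r=2000):
    """Maximum terrain elevation angle versus azimuth, seen from (x0, y0)."""
    h0 = dem[y0, x0]
    profile = np.full(n_az, -np.pi / 2)  # start with an empty sky
    for i, az in enumerate(np.linspace(0.0, 2.0 * np.pi, n_az, endpoint=False)):
        for r in range(1, max_r):
            x = int(round(x0 + r * np.cos(az)))
            y = int(round(y0 + r * np.sin(az)))
            if not (0 <= x < dem.shape[1] and 0 <= y < dem.shape[0]):
                break
            # Elevation angle of this terrain sample above the observer.
            angle = np.arctan2(dem[y, x] - h0, r * cell)
            profile[i] = max(profile[i], angle)
    return profile

def locate(observed, dem, candidates):
    """Return the candidate position whose rendered horizon fits best."""
    scores = [np.mean((horizon_profile(dem, x, y) - observed) ** 2)
              for x, y in candidates]
    return candidates[int(np.argmin(scores))]

A real system would have to compare against a huge number of candidate positions efficiently, which is where the trained model and compressed local terrain data described below come in.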

Making efficient use of LOLA data, a handheld device could be programmed with a local subset of terrain and elevation data to conserve memory. According to work published by Goddard researcher Erwan Mazarico, a lunar explorer can see at most about 180 miles (300 kilometers) from any unobstructed location on the Moon. Even on Earth, Yew’s location technology could help explorers in terrain where GPS signals are obstructed or subject to interference.
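
The 300-kilometer bound comes from Mazarico’s published analysis of actual lunar topography, but its scale can be sanity-checked with the smooth-sphere horizon formula; the numbers below are our own illustrative arithmetic, not the published method.

import math

MOON_RADIUS_M = 1_737_400  # mean lunar radius, meters

def horizon_distance(h_observer, h_target=0.0, radius=MOON_RADIUS_M):
    """Line-of-sight range between two elevated points on a smooth sphere."""
    d1 = math.sqrt(h_observer * (2.0 * radius + h_observer))
    d2 = math.sqrt(h_target * (2.0 * radius + h_target))
    return d1 + d2

# Two high points of roughly 7 km elevation -- relief the Moon really has --
# can see each other across roughly 300 km:
print(horizon_distance(7_000.0, 7_000.0) / 1000.0)  # ~312 km

Since visibility is capped at a few hundred kilometers, a device only ever needs the terrain tiles within that radius, which is what makes a memory-conserving local subset workable.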

Yew’s geolocation system will leverage the capabilities of GIANT (Goddard Image Analysis and Navigation Tool). This optical navigation tool developed primarily by Goddard engineer Andrew Liounis previously double-checked and verified navigation data for NASA’s OSIRIS-REx mission to collect a sample from asteroid Bennu (see CuttingEdge, Summer 2021).

In contrast to radar or laser-ranging tools, which pulse radio signals or light at a target and analyze the returning signals, GIANT quickly and accurately analyzes images to measure the distance to and between visible landmarks. The portable version is cGIANT, a derivative library of Goddard’s autonomous Navigation Guidance and Control system (autoGNC), which provides mission autonomy solutions for all stages of spacecraft and rover operations.
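
As a toy example of such image-based ranging (not GIANT’s actual algorithm), the apparent angular size of a landmark of known physical size fixes its distance; everything below, including the numbers, is assumed for illustration.

import math

def range_from_angular_size(landmark_size_m, size_px, focal_px):
    """Distance to a landmark of known size from its apparent size in an image.

    landmark_size_m: true extent of the landmark, e.g. a crater diameter
    size_px: extent of the landmark in the image, in pixels
    focal_px: camera focal length expressed in pixels
    """
    angular_size = 2.0 * math.atan(size_px / (2.0 * focal_px))
    return landmark_size_m / (2.0 * math.tan(angular_size / 2.0))

# A 500 m crater spanning 100 px in a camera with a 2000 px focal length
# sits about 10 km away:
print(range_from_angular_size(500.0, 100.0, 2000.0))  # ~10,000 m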

Combining AI interpretations of visual panoramas against a known model of a moon or planet’s terrain could provide a powerful navigation tool for future explorers.


By Karl B. Hille

NASA’s Goddard Space Flight Center in Greenbelt, Md.

Source: Steering by Landmarks – On the Moon | NASA

The Insane Biology of: Sloths - Real Science


5-minute Guided Meditation with Jon Kabat-Zinn | MasterClass


TRY NOT TO LAUGH WATCHING FUNNY FAILS VIDEOS 2022 #253 - Daily Dose of Laughter


Short Film - Tiffani Thiessen is Busy - Funny Or Die - Comedy


Danny Trejo Breaks Down His Most Iconic Characters | GQ


Margot Robbie Bloopers and Cute on Set Moments - TheThings


Funny and Weird Clips (2849)