Cameras capturing fragile deep-sea jellies in their element

On an expedition in August 2021, MBARI and the Schmidt Ocean Institute sent a pair of imaging instruments, along with a special DNA sampling apparatus, hundreds of meters deep off the coast of San Diego to explore the midwaters. The researchers used the cameras to scan at least two unnamed creatures: a novel ctenophore and a siphonophore.

Successful scans strengthen the case for virtual holotypes: digital rather than physical specimens that can serve as the basis for a species description when collection is not possible. Historically, the holotype of a species has been a physical specimen meticulously caught, preserved, and cataloged, such as an anglerfish floating in a jar of formaldehyde, a fern pressed in a Victorian book, or an insect pinned in a natural history museum drawer. Future researchers can study these and compare them with other specimens.

Proponents say virtual holotypes such as 3D models are our best chance to document the diversity of marine life, some of which is on the verge of disappearing forever. Without a species designation, scientists cannot monitor populations, identify potential threats, or push for conservation measures.

“The ocean is changing rapidly: increasing temperatures, decreasing oxygen, acidification,” says Allen Collins, a jelly expert with dual appointments at the National Oceanic and Atmospheric Administration and the Smithsonian National Museum of Natural History. “There are still hundreds of thousands, perhaps millions, of species to be named, and we can’t afford to wait.”

Jellies in four dimensions

Marine scientists who study gelatinous midwater creatures all have horror stories of potentially new species disintegrating before their eyes. Collins remembers trying to photograph ctenophores in the wet lab of a NOAA research vessel off the Florida coast: “Within a few minutes, due to heat, light, or pressure, they began to break down,” he says. “Pieces were just coming off. It was a horrible experience.”

Kakani Katija, a bioengineer at MBARI and the driving force behind DeepPIV and EyeRIS, didn’t set out to solve the midwater biologist’s headache. “DeepPIV was developed to look at fluid physics,” she explains. In the early 2010s, Katija and her team were studying how sea sponges filter water and looking for a way to track that flow by recording the three-dimensional positions of tiny particles suspended in it.
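DeepPIV takes its name from particle image velocimetry, the flow-measurement technique described above: motion is inferred from how suspended particles shift between consecutive frames. The snippet below is a minimal sketch of that idea, not MBARI’s code; the window size and synthetic frames are made-up stand-ins.

```python
# A minimal 2-D particle image velocimetry (PIV) sketch: estimate fluid motion
# by cross-correlating two consecutive images of suspended particles. The
# window size and the synthetic frames are made-up stand-ins, not DeepPIV data.
import numpy as np
from scipy.signal import fftconvolve

def piv_displacement(frame_a: np.ndarray, frame_b: np.ndarray) -> tuple[int, int]:
    """Estimate the (rows, cols) shift of the particle field between two frames."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    # Flipping one frame turns convolution into cross-correlation;
    # the correlation peak's offset from the center is the displacement.
    corr = fftconvolve(b, a[::-1, ::-1], mode="same")
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    center = np.array(corr.shape) // 2
    return int(peak[0] - center[0]), int(peak[1] - center[1])

# Synthetic example: a random "particle" field shifted by (3, 5) pixels.
rng = np.random.default_rng(0)
frame_a = rng.random((64, 64))
frame_b = np.roll(frame_a, shift=(3, 5), axis=(0, 1))
print(piv_displacement(frame_a, frame_b))  # expected: (3, 5)
```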

They then realized that the system could be used to non-invasively scan gelatinous animals. Using a powerful laser mounted on a remotely operated vehicle, DeepPIV illuminates one thin cross-section of the creature’s body at a time. “We have a video, and each video frame ends up being one of the images in our stack,” says Joost Daniels, an engineer in Katija’s lab who works on DeepPIV. “And once you have a stack of images, it’s not that different from how people would analyze CT or MRI scans.”
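To make the slice-stacking idea concrete, here is a minimal sketch assuming grayscale frames have already been extracted from such a video; the frame count, resolution, and random data are placeholders rather than DeepPIV specifics.

```python
# A minimal sketch of the image-stack idea: treat each laser-illuminated video
# frame as one 2-D slice and stack the slices into a 3-D volume that can be
# resliced like a CT or MRI scan. The frame count, resolution, and random data
# below are placeholders, not DeepPIV output.
import numpy as np

def frames_to_volume(frames: list[np.ndarray]) -> np.ndarray:
    """Stack 2-D grayscale slices into a single 3-D volume."""
    return np.stack(frames, axis=0)  # shape: (n_slices, height, width)

# Stand-in for one DeepPIV sweep: 100 noisy 480x640 slices.
rng = np.random.default_rng(1)
frames = [rng.random((480, 640)) for _ in range(100)]
volume = frames_to_volume(frames)

# Reslice along another axis to inspect a cross-section the camera
# never recorded directly, just as radiologists reslice scan data.
side_view = volume[:, :, 320]          # every slice, every row, one column
print(volume.shape, side_view.shape)   # (100, 480, 640) (100, 480)
```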

DeepPIV produces a still 3D model, but marine biologists were keen to observe midwater creatures in action. So Katija, MBARI engineer Paul Roberts, and other members of the team created a light-field camera system called EyeRIS that records the precise direction, as well as the intensity, of light in a scene. A microlens array between the camera lens and the image sensor divides the field of view into multiple images, much like the many-faceted eye of a housefly.

EyeRIS’s raw, unprocessed images look like a movie seen after taking off your 3D glasses: multiple offset versions of the same object. But once sorted by depth, the images are transformed into precisely rendered three-dimensional videos, allowing researchers to observe behavior and fine-scale locomotion (jellies are experts in jet propulsion).
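As a rough illustration of how a microlens array multiplexes many slightly offset views onto one sensor, and how they can be pulled apart again, consider the sketch below. The 5-pixel lens pitch and the random image are assumptions for the example, not EyeRIS specifications, and a real light-field pipeline involves calibration and depth estimation beyond this step.

```python
# A rough sketch of splitting a raw lenslet (light-field) image into its
# sub-aperture views. The 5-pixel lens pitch and the random image are
# assumptions for illustration only.
import numpy as np

def subaperture_views(raw: np.ndarray, pitch: int) -> np.ndarray:
    """Split a raw lenslet image into a (pitch x pitch) grid of offset views.

    raw   : 2-D sensor image whose height and width are multiples of `pitch`
    pitch : number of sensor pixels covered by each microlens along one axis
    """
    h, w = raw.shape
    # Pixels sitting at the same offset (u, v) under every microlens all see
    # the scene from the same direction; gathering them forms one shifted view.
    blocks = raw.reshape(h // pitch, pitch, w // pitch, pitch)
    return blocks.transpose(1, 3, 0, 2)  # (u, v, view_height, view_width)

# Synthetic raw capture: 25 views multiplexed onto a 500x500 sensor.
rng = np.random.default_rng(2)
raw = rng.random((500, 500))
views = subaperture_views(raw, pitch=5)
print(views.shape)  # (5, 5, 100, 100) -> 25 offset views of the same scene
```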

What is the value of a picture?

For decades, researchers have occasionally tried to describe new species without a traditional holotype: a South African bee fly using only high-resolution photos, a mysterious owl with photos and recordings of its calls. Doing so can incur the wrath of some scientists: in 2016, for example, hundreds of researchers signed a letter defending the sanctity of the traditional holotype.

But in 2017, the International Commission on Zoological Nomenclature, the governing body that publishes the code determining how species are named, issued a clarification of its rules: where collection is not possible, new species can be described without a physical holotype.

In 2020, a team of scientists including Collins described a new genus and species of comb jelly based on high-definition video. (Duobrachium sparksae, as it was christened, looks something like a translucent Thanksgiving turkey with streamers trailing from its drumsticks.) Notably, there was no grumbling from the taxonomist peanut gallery, a win for proponents of digital holotypes.

Collins says the MBARI team’s visualization techniques only strengthen the case for digital holotypes, because they come closer to the detailed anatomical studies that scientists conduct on physical specimens.

A parallel movement to digitize existing physical holotypes is also gaining momentum. Karen Osborn is a researcher and curator at the Smithsonian National Museum of Natural History specializing in midwater invertebrates such as annelids and peracarids, animals that are much more substantial and easier to collect than midwater jellies. Osborn says the pandemic has underlined the usefulness of high-quality digital holotypes. Countless field expeditions were interrupted by travel restrictions, and annelid and peracarid researchers “can’t get in [to the lab] and look at any specimens,” she explains, so they can’t currently describe anything from physical types. But work is booming through the digital collections.

Using a micro-CT scanner, Smithsonian scientists have given researchers around the world access to holotype specimens in the form of minutely detailed 3D reconstructions. When she receives a specimen request, which would typically mean mailing the priceless holotype at risk of damage or loss, Osborn says she offers to send a virtual version first. While most researchers are initially skeptical, “they always end up saying, ‘Yes, I don’t need the sample. I have all the information I need.’”

“EyeRIS and DeepPIV give us an even cooler way to document things in situ,” adds Osborn. During her research cruises, she has seen the systems in action on giant larvaceans, tiny invertebrates that secrete intricate “slime palaces,” which scientists had never been able to observe completely intact until DeepPIV.

Katija says the MBARI team is considering ways to gamify species identification, inspired by Foldit, a popular citizen science project in which “players” use a video game-like platform to work out the structures of proteins.

In the same spirit, citizen scientists could help analyze images and scans captured by ROVs. “Pokémon Go had people roaming their neighborhoods looking for fake stuff,” Katija says. “Can we harness that energy and have people looking for things that science doesn’t know about?”

Elizabeth Anne Brown is a science journalist based in Copenhagen, Denmark.
