Press "Enter" to skip to content

A sweet father-son bond inspires tasty new molecule models

Thirteen-year-old Noah Shaw loves planets and has perfect pitch. He wants to be a scientist like his father Bryan Shaw, a biochemist at Baylor University in Waco, Texas. But Noah’s path to science may not be as smooth as it was for the elder Shaw.

Diagnosed with retinoblastoma as an infant (SN: 1/5/85), Noah now has only one eye and permanent blind spots in his vision. People with one eye, like Noah, and people who have blindness or limited vision, are underrepresented in science and face barriers in STEM education. “Most of the stunning imagery in science is inaccessible to people who are blind,” Bryan Shaw says. That makes him wistful because renderings of proteins hooked him on science.

In an effort to help make science more inclusive, Shaw and his colleagues have come up with bite-sized molecule models that take advantage of the mouth’s supersensitive touch sensors, which can perceive finer details than our fingertips can.

Biochemist Bryan Shaw (left) — inspired by his son Noah (right), whose vision was affected by cancer — created edible and nonedible models of proteins that students can explore with their mouths. Courtesy of Elizabeth Shaw

The team created gummy candy models of important proteins, including myoglobin, which provides oxygen to muscles, and also 3-D printed nonedible, nontoxic versions (SN: 3/16/15). Both can be popped in the mouth for investigation. Once the researchers attached lanyards to the nonedible models to prevent choking, the team tested how well 281 college students and 31 grade-schoolers could distinguish proteins using the edible or nonedible models while blindfolded.

Each student examined one protein model either by mouth or by hand. For each additional protein model they assessed, students had to say whether it was the same protein as the first or a different one. A separate group of 84 college students did the test by eyesight, using 3-D computer images of proteins instead of physical models.

Students correctly identified the proteins about 85 percent of the time, regardless of whether they used their mouths, fingers or eyes, the team reports May 28 in Science Advances. Such cheap, tiny models could help students learn about proteins regardless of vision acuity, Shaw says.

Shaw got the idea for this would-be educational tool while twirling a blackberry on his tongue. A blackberry’s bumpy exterior resembles one popular way that scientists depict proteins, in which each of the protein’s atoms is drawn as a sphere. Stick thousands of atoms together, and the conglomerate resembles an elaborate berry — something the tongue might be able to tell apart by shape.
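For readers curious what that sphere-per-atom ("space-filling") depiction looks like in practice, here is a minimal sketch of the idea — not the researchers' actual pipeline. It assumes the Biopython library, uses the classic sperm whale myoglobin entry 1MBN from the Protein Data Bank as an example, and applies an approximate van der Waals radius table chosen for illustration.

```python
# Minimal sketch (illustrative only): fetch a myoglobin structure from the
# Protein Data Bank and list the sphere centers and radii a space-filling,
# "blackberry-like" model would assign to each atom.
# Assumes Biopython is installed; 1MBN is an example PDB entry for myoglobin.
from Bio.PDB import PDBList, PDBParser

# Approximate van der Waals radii in angstroms for common protein atoms
# (values chosen for illustration).
VDW_RADII = {"C": 1.70, "N": 1.55, "O": 1.52, "S": 1.80, "H": 1.20}

pdb_file = PDBList().retrieve_pdb_file("1mbn", pdir=".", file_format="pdb")
structure = PDBParser(QUIET=True).get_structure("myoglobin", pdb_file)

spheres = []
for atom in structure.get_atoms():
    x, y, z = atom.coord                       # sphere center in angstroms
    radius = VDW_RADII.get(atom.element, 1.5)  # fallback for rare elements
    spheres.append((x, y, z, radius))

print(f"{len(spheres)} overlapping spheres make up this model")
```

Turning that cloud of overlapping spheres into a printable or moldable surface would take an additional meshing step in 3-D modeling software, which is beyond this sketch.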

Many infants and toddlers explore the world by mouth. A student in Hong Kong made headlines in 2013 for teaching herself to read Braille with her lips. Yet the mouth’s remarkable sensing ability remains largely untapped in science education, Shaw says.

Shaw has patented the models and is eager for feedback. But taking the models from prototype to teaching tool will require more work. For instance, the researchers have access to professional equipment to print models and sterilize them between uses — something not all educators have.

Most importantly, the models would benefit from testing by students who are blind and those who have low vision. Input from these students will help Shaw’s team improve the models to better fit the students’ needs. Shaw has initiated conversations about the models with educators at the Texas School for the Blind and Visually Impaired in Austin. Noah did test the models, but the researchers didn’t include his data in the analysis.

This is not the first time that Noah has inspired his dad. Shaw previously codeveloped an app that has the potential to catch early signs of eye disease in childhood pictures. Regardless of whether Noah pursues science, his father has one wish: “I hope he does something cool.”

Source: Science News
