
MIT researchers have developed a new theoretical framework for studying the mechanisms of treatment interactions. Their approach allows scientists to efficiently estimate how combinations of treatments will affect a group of units, such as cells, enabling researchers to perform fewer, less costly experiments while gathering more accurate data.

For example, to study how interconnected genes affect cancer cell growth, a biologist might need to use a combination of treatments to target multiple genes at once. But because there can be billions of potential combinations in each round of experiments, choosing only a subset of combinations to test can bias the data the experiment generates.

In contrast, the new framework considers a setting in which the user can efficiently design an unbiased experiment by applying all treatments in parallel and controlling the outcome by adjusting the rate, or dosage, of each treatment.

The MIT researchers theoretically derived a near-optimal strategy in this framework and ran a series of simulations to test it across multiple rounds of experiments. Their method minimized the error in each instance.

This technique could one day help scientists better understand disease mechanisms and develop new drugs to treat cancer or genetic disorders.

“We’ve introduced a concept that people can think about as they study the best way to select combinatorial treatments at each round of an experiment. We hope this can one day be used to solve biologically relevant questions,” says Jiaqi Zhang, a co-lead author of the study.

Zhang is joined on the paper by co-lead author Divya Shyamal, an MIT undergraduate, and by senior author Caroline Uhler, the Andrew and Erna Viterbi Professor of Engineering in EECS and the Institute for Data, Systems, and Society (IDSS), who is also director of the Eric and Wendy Schmidt Center and a researcher at MIT’s Laboratory for Information and Decision Systems (LIDS). The research was recently presented at the International Conference on Machine Learning.

Applying treatments simultaneously

Treatments can interact with each other in complex ways. For example, a scientist trying to determine whether a certain gene contributes to a particular disease symptom may have to target several genes simultaneously to study the effect.

To do this, scientists use so-called combinatorial perturbations, where they apply multiple treatments to the same group of cells at once.

“A combinatorial perturbation gives you a high-level network of how different genes interact, which provides an understanding of how cells function,” Zhang explained.

Since genetic experiments are costly and time-consuming, scientists try to select the best subset of treatment combinations to test, which is a huge challenge given the enormous number of possibilities.

Selecting a suboptimal subset can also produce biased results, since the experiment focuses only on combinations the user chose in advance.

The MIT researchers approached the problem differently, through a probabilistic framework. Rather than focusing on a pre-selected subset, each unit randomly receives a combination of treatments according to user-specified dosage levels for each treatment.

The user sets dosage levels based on the goal of the experiment – perhaps the scientist hopes to study the effects of four different drugs on cell growth. The probabilistic approach also yields less biased data, because it does not restrict the experiment to a predetermined subset of treatments.

Dosage levels act like probabilities, and each cell receives a random combination of treatments. If the user sets a high dosage for a particular treatment, it is more likely that most of the cells will receive that treatment. A lower dosage means a smaller subset of cells will receive it.
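To make the idea concrete, here is a minimal sketch of how such a randomized assignment could work, assuming each dosage level is used as an independent probability that a given cell receives the corresponding treatment; the function name, cell count, and dosage values are illustrative and not taken from the paper.

```python
import numpy as np

def assign_treatments(num_cells, dosages, seed=None):
    """Illustrative sketch: treat each dosage level as the probability that an
    individual cell receives that treatment, so every cell ends up with a
    random combination of treatments."""
    rng = np.random.default_rng(seed)
    dosages = np.asarray(dosages)                      # one probability per treatment
    # assignment[i, j] == 1 means cell i received treatment j
    return (rng.random((num_cells, len(dosages))) < dosages).astype(int)

# Example: 1,000 cells and four drugs with different dosage levels.
assignment = assign_treatments(1000, dosages=[0.8, 0.5, 0.5, 0.1], seed=0)
print(assignment.mean(axis=0))   # fraction of cells receiving each drug, close to the dosages
```

With a dosage of 0.8, roughly 80 percent of the cells receive that drug, while a dosage of 0.1 reaches only about 10 percent, which is the behavior described above.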

“From there, the question is how do we design the dosages so that we can estimate the outcomes as accurately as possible? That is where our theory comes in.”

Their theoretical framework shows the best way to design these dosages so the researcher can learn as much as possible about the characteristic or trait they are studying.
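As a rough illustration of what estimating outcomes from such a randomized design could look like, the sketch below simulates hypothetical per-treatment effects plus one pairwise interaction and recovers them with ordinary least squares from the random assignment matrix; the effect sizes, the interaction term, and the linear outcome model are assumptions made for illustration, not the authors’ model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ground truth: main effects for four treatments and one pairwise interaction.
main_effects = np.array([1.0, -0.5, 0.3, 0.0])
interaction_01 = 0.4                                   # treatments 0 and 1 interact

# Randomized assignment: every cell gets each treatment with probability 0.5.
assignment = (rng.random((2000, 4)) < 0.5).astype(float)
outcomes = (assignment @ main_effects
            + interaction_01 * assignment[:, 0] * assignment[:, 1]
            + rng.normal(0, 0.1, len(assignment)))     # measurement noise

# Design matrix: intercept, the four main-effect columns, and the 0x1 interaction column.
X = np.column_stack([np.ones(len(assignment)), assignment,
                     assignment[:, 0] * assignment[:, 1]])
coef, *_ = np.linalg.lstsq(X, outcomes, rcond=None)
print(np.round(coef, 2))   # estimated intercept, main effects, and interaction
```

Because every combination occurs at random rather than by pre-selection, the estimates are not skewed toward any particular subset of combinations; how to pick the dosages so these estimates are as accurate as possible is what the researchers’ theory addresses.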

After each round of experiments, the user collects the results and feeds them back into the framework, which outputs an ideal dosage strategy for the next round, and so on, actively adapting the strategy over multiple rounds.
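The sketch below shows the general shape of such an adaptive loop, assuming a toy simulated experiment; the dosage-update rule here is only a stand-in, since the paper’s actual strategy is not spelled out in this article.

```python
import numpy as np

rng = np.random.default_rng(0)
true_effects = np.array([1.0, -0.5, 0.3, 0.0])         # hypothetical per-treatment effects

def run_round(dosages, num_cells=1000):
    """Simulate one round: assign treatments at random, then observe noisy outcomes."""
    assignment = (rng.random((num_cells, len(dosages))) < dosages).astype(int)
    outcomes = assignment @ true_effects + rng.normal(0, 0.1, num_cells)
    return assignment, outcomes

def update_dosages(dosages, assignment, outcomes):
    """Stand-in for the researchers' strategy: nudge each dosage toward 0.5 to balance
    treated and untreated cells (NOT the actual rule from the paper)."""
    return 0.9 * dosages + 0.1 * 0.5

dosages = np.full(4, 0.2)         # initial dosage level for each of four treatments
for round_idx in range(5):        # several adaptive rounds of experimentation
    assignment, outcomes = run_round(dosages)
    dosages = update_dosages(dosages, assignment, outcomes)
    print(round_idx, np.round(dosages, 3))
```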

Optimizing dosages, minimizing error

The researchers proved that their theoretical approach generates optimal dosages, even when the dosage levels are constrained by a limited supply of treatments or when the noise in the experimental outcomes varies from round to round.

In simulations, the new method had the lowest error when comparing estimated and actual outcomes over multiple rounds of experiments, outperforming two baseline methods.

In the future, the researchers hope to extend their framework to account for interference between units and for the fact that certain treatments can lead to selection bias. They would also like to apply the technique in a real experimental setting.

“This is a new approach to a very interesting problem that is hard to solve. Now, with this new framework, we can think more about the best way to design experiments for many different applications,” Zhang said.

The research was funded, in part, by MIT’s Advanced Undergraduate Research Opportunities Program, Apple, the National Institutes of Health, the Office of Naval Research, the Department of Energy, the Eric and Wendy Schmidt Center, and a Simons Investigator Award.
