
Will "In Silico" Biology Replace Traditional Laboratory Experimentation?

 
 
wandeljw

Reply Thu 21 Oct, 2010 01:53 pm
Can the study of living systems be carried out entirely with computational modeling tools?

Here is an example:
Quote:
New Computational Tool for Cancer Treatment
(ScienceDaily, Feb. 4, 2010)

Many human tumors express indoleamine 2,3-dioxygenase (IDO), an enzyme that mediates immune escape in several cancer types. Researchers in the Molecular Modeling group at the SIB Swiss Institute of Bioinformatics and Dr. Benoît J. Van den Eynde's group at the Ludwig Institute for Cancer Research Ltd (LICR) Brussels Branch developed an approach for creating new IDO inhibitors by computer-assisted structure-based drug design. The study was presented in the January 2010 online issue of the Journal of Medicinal Chemistry.

The docking algorithm EADock, used for this project, was developed by the Molecular Modeling Group over the last eight years. It provides solutions for the "lock-and-key" problem, wherein the protein active site is regarded as a "lock," which can be fitted with a "key" (usually a small organic molecule) able to regulate its activity. Once an interesting molecule has been obtained, synthesis and laboratory experiments are necessary to confirm or reject the prediction. This algorithm will soon be made available to the scientific community worldwide.

The scientists obtained a high success rate: fifty percent of the molecules designed in silico were active IDO inhibitors in vitro. The compounds displayed activities in the low micromolar to nanomolar range, making them suitable for further testing in tumor cell experiments and for in vivo evaluation in mice. If these studies are successful, scientists can begin evaluating these new compounds in patients undergoing cancer immunotherapy.

According to Olivier Michielin, Assistant Member at the Lausanne Branch of LICR and leader of the SIB Swiss Institute of Bioinformatics Molecular Modeling group, "This is a satisfactory proof of principle showing that computational techniques can produce very effective inhibitors for specific cancer targets with high yield. This is very encouraging for future drug developments in the academic environment."
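
To make the "lock-and-key" idea in the quote above a bit more concrete, here is a toy Python sketch of the ranking step in structure-based screening. It is purely illustrative: this is not EADock, and the feature vectors and distance-based "fit" score are invented stand-ins for real docking energetics.

Code:
# Toy illustration of the "lock-and-key" ranking step in computer-assisted
# screening. This is NOT EADock: the "lock" and "keys" are made-up feature
# vectors, and the squared-distance "fit" score stands in for real docking
# energetics. The point is only the shape of the workflow: score many
# candidate molecules against one target site and keep the best few.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Candidate:
    name: str
    features: Tuple[float, ...]  # hypothetical descriptors (e.g., charge, size)

def fit_score(lock: Tuple[float, ...], key: Tuple[float, ...]) -> float:
    """Lower is better: a crude 'fit' measured as squared feature distance."""
    return sum((a - b) ** 2 for a, b in zip(lock, key))

def rank_candidates(lock: Tuple[float, ...], library: List[Candidate],
                    top_n: int = 3) -> List[Candidate]:
    """Return the top_n candidates with the best (lowest) fit scores."""
    return sorted(library, key=lambda c: fit_score(lock, c.features))[:top_n]

if __name__ == "__main__":
    binding_site = (0.7, -1.0, 3.2)  # invented descriptor vector for the "lock"
    library = [
        Candidate("mol_A", (0.6, -0.9, 3.0)),
        Candidate("mol_B", (0.1, 1.2, 5.0)),
        Candidate("mol_C", (0.8, -1.1, 3.3)),
    ]
    for c in rank_candidates(binding_site, library, top_n=2):
        print(c.name, round(fit_score(binding_site, c.features), 3))

Real docking programs evaluate thousands of candidate poses with physics-based or empirical energy functions; the point here is only the overall workflow: score many candidates against one target, keep the best, then confirm them at the bench, as the article notes.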

 
talk72000
 
Reply Thu 21 Oct, 2010 03:11 pm
@wandeljw,
There are advantages and disadvantages. Speed could be an advantage, as some biological processes are slow and conditional. At the same time, the biological process is the real deal, so simulation can never fully replace reality; after all, the final product has to be used in the real world.
farmerman
 
Reply Thu 21 Oct, 2010 08:12 pm
@talk72000,
I saw an example of some "feedback" models that they use in some med schools for surgery. This supplants much of the pig and monkey operations and gets the students ready for their actual attempts at living tissues.
The cost and difficulty of acquiring living tissue have prompted some med schools to insert a lot of technology up front, so that gross clinical observations can be taught uniformly and without wasting precious cadavers and tissue samples.

There's really nothing wrong with using technology, and, with the growth of modelling outputs and CGI capabilities, we've come a really long way in less than 10 years. I recall that, 10 years ago, CGI representations of various tissues and textures were poor to middling. Now, think about how artists can render stuff like hair and skin just for movies.
As far as process modelling goes, many subjects now use fishnet (finite element) modelling to show 3D representations of structure, movement, and expansion speed. The power we have in desktops makes modelling available to all kinds of technicians.

Any system that can be described mathematically can be modeled. The finer the modelling, the more iterative functions are involved. It's easy to build models of many natural systems on a step-by-step basis. These kinds of models took huge computers 20 years ago and mainframes until about 15 years ago; now any kid with 4 terabytes of HD can run many "sim" functions and model natural (including biological) systems.
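
To illustrate the step-by-step point, here is a minimal Python sketch of an iterative model of a simple biological system: logistic population growth advanced with fixed Euler steps. The equation and the numbers are generic textbook choices, not anything specific from this thread.

Code:
# Minimal sketch of step-by-step ("iterative") modelling of a natural system:
# logistic population growth advanced with fixed Euler time steps.
# The model and numbers are invented for illustration; finer steps mean more
# iterations but a smoother, more accurate trajectory.

def simulate_logistic(p0: float, r: float, k: float, dt: float, steps: int) -> list:
    """Return the trajectory [p0, p1, ...] produced by `steps` Euler updates."""
    trajectory = [p0]
    p = p0
    for _ in range(steps):
        p += r * p * (1 - p / k) * dt  # dP/dt = r * P * (1 - P/K)
        trajectory.append(p)
    return trajectory

if __name__ == "__main__":
    # Coarse vs. fine stepping over the same 50 time units.
    coarse = simulate_logistic(p0=10.0, r=0.3, k=1000.0, dt=5.0, steps=10)
    fine = simulate_logistic(p0=10.0, r=0.3, k=1000.0, dt=0.5, steps=100)
    print("coarse final:", round(coarse[-1], 1))
    print("fine final:  ", round(fine[-1], 1))

Halving the step size doubles the number of iterations; that trade-off between resolution and compute is exactly what used to demand mainframes and now runs on a desktop.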


My only concern, and I've seen it happen in my field, is that the kids start losing touch with the calculus and linear algebra that got them there.

 
wandeljw
 
Reply Fri 22 Oct, 2010 02:22 pm
There is a new discipline called "data-intensive science," which organizes large volumes of data from multiple sources. Analysis of the data is carried out through visualizations, simulations, and other kinds of digital modeling. Microsoft offers a free PDF of essays about the emerging field of "eScience":

http://research.microsoft.com/en-us/collaboration/fourthparadigm/
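
For a trivial, made-up illustration of what "organizing data from multiple sources" can look like in code (the tables, column names, and numbers are all invented), here is a short Python sketch:

Code:
# Toy sketch of a "data-intensive" workflow: join two hypothetical measurement
# tables on a shared key, then summarize them for later visualization.
# All data are invented; real eScience pipelines work at a vastly larger scale,
# but the organize-then-analyze shape is the same.

import pandas as pd

# Two independent "sources" describing the same samples.
lab_assays = pd.DataFrame({
    "sample_id": [1, 2, 3, 4],
    "inhibition_pct": [85.0, 12.5, 60.2, 91.3],
})
imaging = pd.DataFrame({
    "sample_id": [1, 2, 3, 4],
    "cell_count": [1200, 3400, 2100, 900],
})

# Organize: merge on the shared key, then derive a summary table.
merged = lab_assays.merge(imaging, on="sample_id")
merged["active"] = merged["inhibition_pct"] > 50
summary = merged.groupby("active")[["inhibition_pct", "cell_count"]].mean()
print(summary)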