
Does DNA have the equivalent of IF-statements, WHILE loops, or function calls?

roughly
29 replies
23h26m

Something to be aware of as a computer person coming into biology: Biology only _looks_ like it has logical constructs, abstraction layers, and general computational frameworks. When you start working with it, you'll find everything is messier than you expected and everything interacts with everything at every level. There are no abstractions, there are no actual boundaries, and your outputs are subject to evolutionary pressures even when sitting in a flask. Many an engineer has washed aground on the rocky shores of biological indeterminism - it's not a machine, no matter how much it kind of looks like one when you squint at it.

munificent
8 replies
23h20m

The way I think about it is like this: Code is read and maintained by humans, so it is under selection pressure to be comprehensible to human minds.

Reality has no such constraint. If the biological processes of some animal lead to inscrutable irreducibly complex "spaghetti code" but the animal is slightly more likely to produce offspring... the complexity wins. The natural world doesn't care whether or not we can understand it.

(There is, of course, a natural selection pressure going the other direction. Brains that understand the natural world do seem to often be generally more likely to produce offspring, so there's evolutionary pressure for minds to understand the world, but not for the world to be understood by minds.)

bossyTeacher
2 replies
23h13m

Brains that understand the natural world do seem to often be generally more likely to produce offspring

I would argue that the evolutionary pressure only applies to a limited level of understanding of the physical world. Think about how many kids someone like Stephen Hawking had, compared to how many kids poorly educated people have.

sjfjsjdjwvwvc
1 replies
22h56m

On the other hand I would argue that evolutionary pressure works differently for modern humans in the 20th century and beyond than it did for the majority of human evolution.

Sure, uneducated people now often have more kids due to fewer contraception options and less knowledge, and as old-age insurance. But then again, rich people have more kids than the middle class because of, well, more money and everything that comes with it.

In any case all of this is anecdotal and emotion based because there is not much good data on this. Or maybe there is?

bossyTeacher
0 replies
22h50m

rich people have more kids than middle class

Do they? I am pretty sure that the vast majority of the human population is poor, and not just "can't afford an iPhone" poor but "no food on the table" poverty. The rich are but a tiny minority. In terms of reproduction, poor people win easily. Last time I checked, rich people are more likely to be highly educated compared to anyone else, and education is a predictor of reproduction rate (more education, less reproduction; less education, more reproduction).

roughly
0 replies
22h18m

The evolutionary pressure is at the root here, but the problem isn't readability, it's the lack of abstraction.

The central dogma is DNA -> RNA -> protein, except that:

1. RNA can fold into a functional molecule
2. RNA can be spliced differently to make several different proteins
3. RNA can bind with proteins to create other functional units
4. The availability of amino acids and tRNA units within the cell affects the speed at which a protein is translated from RNA, which can affect its structure, which affects its function
5. The codon language is not guaranteed to be identical between organisms, which means the same RNA sequence can encode different amino acid sequences in different organisms

And that's just in the transcription/translation machinery. That's what I mean when I say there's no abstraction: biology is a physical system, which means molecular & atomic interactions influence every level of every process.
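
To make caveat 5 concrete, here's a toy Python sketch (purely illustrative, not real bioinformatics): the same codon sequence decodes differently under different codon tables, as happens with UGA in vertebrate mitochondria, where it is read as tryptophan rather than stop.

```python
# Toy illustration: the same RNA can yield different proteins depending
# on which codon table the organism (or organelle) uses.

STANDARD = {"AUG": "Met", "UGA": "Stop"}
# In vertebrate mitochondria, UGA encodes tryptophan instead of stop.
MITOCHONDRIAL = {"AUG": "Met", "UGA": "Trp"}

def translate(codons, table):
    out = []
    for codon in codons:
        aa = table[codon]
        if aa == "Stop":
            break  # translation terminates at a stop codon
        out.append(aa)
    return out

rna = ["AUG", "UGA", "AUG"]
print(translate(rna, STANDARD))       # ['Met'] -- UGA terminates early
print(translate(rna, MITOCHONDRIAL))  # ['Met', 'Trp', 'Met']
```

The same three codons produce a one-residue peptide in one context and a three-residue peptide in another, which is exactly why "the code" isn't a fixed abstraction boundary.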

nico
0 replies
23h11m

In some way, neural networks and newer ML models seem to be going in this direction. We are starting to get faster/better results with black boxes than with if-else style code

campbel
0 replies
23h11m

Yes! Optimal designs may not be "elegant" from the human perspective. Reminds me of these complex computer generated airframe designs https://www.bbc.com/future/article/20181129-the-ai-transform....

bossyTeacher
0 replies
22h58m

The natural world doesn't care

I always find this kind of phrasing problematic due to its implicit anthropomorphic metaphors.

Whether spaghetti code or not, I think people often just assume that we have the capacity to understand it, given the period of time between now and when humans disappear (because of evolution, a meteorite, climate change, or human intervention). I am highly skeptical. Maybe whatever species emerges after us might have a better chance, but to them we might be no different than Homo erectus is to us Homo sapiens.

allemagne
0 replies
22h47m

Code is read and maintained by humans, so it is under selection pressure to be comprehensible by human minds.

I think the opposite is actually true for code that lives in long-running projects that change hands relatively often. It seems like the safest state for code to be in is actually to be incomprehensible, so it's too "scary" for any developer to refactor. All the comprehensible code can meanwhile be safely moved around until it's eventually incomprehensible as well.

Speaking from complete naivete about biology, I wonder if there's a similar game going on with viruses and cell machinery. If the cell's processes are too "legible", then presumably they're easier for a virus to evolve to hijack.

bossyTeacher
6 replies
23h9m

I would argue that it is only messy because we are looking at the wrong level of abstraction with an incorrect computational model. At the end of the day, if it has inputs and produces outputs, it is a computing device (i.e. a computer).

__loam
5 replies
23h2m

Seems like an overly broad definition. A pizza oven has inputs and outputs but you would probably struggle to find someone willing to call it a computer.

bossyTeacher
4 replies
22h55m

Does it? A button is no more equivalent to an input than pulling down a lever that initiates a process culminating in the lighting of a candle.

BenjiWiebe
3 replies
22h43m

You put pizzas in and get pizzas out.

bossyTeacher
2 replies
22h15m

Just because you placed an object x inside an object y does not make y a computer -_- Computing is about data

__loam
1 replies
22h7m

We're using the definition you gave.

bossyTeacher
0 replies
22h2m

I said input though not "anything"

sjducb
3 replies
23h19m

As a biologist who moved into software engineering I love working with actual machines.

You can think of a design, build it and have it work as you expect. You can’t do that in biology.

The reason is that we “understand” how computers work. Our understanding of biology is nowhere near that level. A high percentage of established knowledge is wrong or so incomplete that it’s not useful for designing new systems. This is hard for engineers or even scientists from physics and chemistry to accept.

roughly
1 replies
22h11m

I think another reason is that biological systems are _fantastically_ complicated - for example, we're finding that the spatial distribution of molecules within the cell affects enzymatic processes in ways that may have implications for Alzheimer's and dementia. For another, there are various bacterial communities where, if you want to trace the enzymatic pathway for a particular product, you're jumping between species in the community, and not just once - that particular product is the result of multiple distinct enzymatic pathways in multiple different organisms (and whatever else exists in the environment outside the cell).

I've been a software engineer for 20 years now, and I don't underestimate the complexity of computer systems or software - I think the number of people in the world who could really accurately describe every part of every system involved in showing you this note is maybe ten - but we've got nothing on biological systems.

sjducb
0 replies
8h2m

I agree that biology is fantastically complicated.

When I say we “understand” everything that matters about a computer; I don’t mean one person understands. I mean each important part is understood by at least one person. Together humanity understands everything important about a computer.

pezo1919
0 replies
21h50m

Wondering: if there were an "easier" way to express something in biology/evolution, then evolution would (already) have compressed it like that.

Maybe by definition: currently existing things in evolution are not "more compressible" (for us) without losing meaning? (Their meaning could be estimated, though, through leaky abstractions?)

foobarian
1 replies
22h32m

So, basically like the legacy code base at any large enterprise :D

codethief
0 replies
21h52m

Years ago, someone here on HN compared evolution of the DNA to the evolution of large-scale code bases. This really stuck with me. One example that was brought up was: There is no single region in the DNA that's responsible for the human eye. More generally: The mapping between phenotype and genotype tends to be fuzzy and the individual role and impact of genes is often unclear. Now in a somewhat similar way, in large code bases the mapping between features and code can also become fuzzy, and the role of a single line of code can be surprising. There can be side effects and a ton of implicit/hidden dependencies across the code base, maybe even spooky action at a distance. As a result, product features will no longer be confined to individual code files or modules. Rather, their exact logic will spread out all across the code base, and a single line of code can affect many features at once.

alexchamberlain
1 replies
23h22m

Computers only look like they have windows, digital signals and logic gates. It's all just pixels, noisy voltages and imperfect silicon.

jakeinspace
0 replies
23h5m

Sure, but because of the transistor’s nonlinearity (cutoff and saturation regions, for mosfet at least), we get digital logic. Digital logic means that errors don’t accumulate, and that prevents complexity from infecting everything in the computer. We can put neat little abstraction boxes around parts of the SoC, and then do the same with our software.

Digital computers are only noisy compared to the simplest processes, like a low rate transmission line. Compared to biological or chemical processes, a digital computer is just ludicrously low-noise, because noise doesn’t propagate unless there is a major fault.
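
A toy sketch of that regeneration property (the noise level and stage count are made-up illustrative numbers, not real circuit parameters): analog noise is injected at every stage, but thresholding back to a rail discards it, so errors don't accumulate across stages.

```python
import random

def through_stages(bit: int, stages: int, sigma: float, rng: random.Random) -> int:
    """Pass a bit through a chain of noisy stages, re-thresholding each time."""
    level = float(bit)
    for _ in range(stages):
        level += rng.gauss(0, sigma)          # analog noise on the "wire"
        level = 1.0 if level > 0.5 else 0.0   # the gate snaps back to a rail
    return int(level)

rng = random.Random(42)
# 1000 bits, each through 100 noisy stages; sigma is well below the
# threshold margin, so essentially every bit survives intact.
results = [through_stages(1, stages=100, sigma=0.1, rng=rng) for _ in range(1000)]
print(sum(results))
```

If you instead accumulated the analog noise without re-thresholding, the signal would random-walk away from its value after a few dozen stages; the nonlinearity is what makes the abstraction boxes hold.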

mgfist
0 replies
23h22m

Sounds like many a codebase I've worked on haha

ilamont
0 replies
22h53m

The assumption that biological systems can be easily programmed or engineered has led to a lot of wasted Silicon Valley capital, misguided movements like "biohacking" and outright fraud, ranging from bogus supplements to devices promising better physical or mental health (search for "quwave defender" on Amazon or Google for one particularly noxious example).

dullcrisp
0 replies
23h22m

So just like my codebase then

SilasX
0 replies
22h57m

Yep -- there was a thread a while back about whether a biologist could reverse-engineer a radio, and I made the comment that electronics are designed to keep the various functions nicely decoupled, while biological systems aren't/don't, because they optimize for something else, leading them to bleed state across the entire system:

https://news.ycombinator.com/item?id=32048972

(Also links to previous discussions on this theme.)

Affric
0 replies
20h13m

Biology being my first love, I think what really leads non-biology-based scientists, engineers, designers, artists, and people generally astray is that they can't liberate their minds from their top-down, all-seeing perspective.

Teleological matter is the kind of machine you get when the vast majority of the form cannot be designed.

visarga
27 replies
1d8h

Well, DNA might not have programming structures, but it acts like a neural network. See "Gene regulatory network"

Some proteins though serve only to activate other genes, and these are the transcription factors that are the main players in regulatory networks or cascades. By binding to the promoter region at the start of other genes they turn them on, initiating the production of another protein, and so on. Some transcription factors are inhibitory. [1]

These networks are similar to neural networks in that they process information through a series of interconnected nodes (genes and proteins, in the case of GRNs) that influence each other's activity. The activation or inhibition of one gene by a transcription factor can trigger a cascade of events, much like the way neurons activate each other in a neural network.

[1] https://en.m.wikipedia.org/wiki/Gene_regulatory_network
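
As a rough illustration of the cascade idea, here's a hypothetical three-gene boolean toy model (gene names and rules are invented for illustration, not taken from any real network): gene A's product activates B, and B's product represses C.

```python
# Boolean toy model of a regulatory cascade: A activates B, B represses C.

def step(state):
    a, b = state["A"], state["B"]
    return {
        "A": a,        # A is constitutively expressed in this toy
        "B": a,        # B turns on when A's transcription factor is present
        "C": not b,    # C is inhibited by B's product
    }

state = {"A": True, "B": False, "C": True}
for _ in range(3):
    state = step(state)
print(state)  # {'A': True, 'B': True, 'C': False}
```

Even this three-node toy takes two update steps to settle; real GRNs have thousands of nodes, continuous concentrations, and feedback loops, which is where the neural-network analogy comes from.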

Garlef
18 replies
1d8h

Before anyone jumps to esoteric conclusions:

This is more likely a result of neural networks being very general: General in the sense that they are able to approximate/model a lot of dynamic systems that allow operadic composition.

fasterik
15 replies
1d7h

It's still an interesting fact that evolution seems to have figured out that neural networks are good at approximating arbitrary continuous functions. We could ask whether intelligent alien species are likely to use neural nets. In other words, are neural nets universal in some sense, or are they contingent to how life evolved on Earth?

gumballindie
11 replies
1d7h

Yes, it was nature that figured out AI neural nets are efficient, not humans implementing algorithms who found out that nature makes efficient use of neural nets. Aliens will first have to make contact and duly learn how to use neural nets to develop intelligence. A catch-22.

fasterik
10 replies
1d7h

What I mean by aliens using neural nets is that we don't know what percent of evolved intelligent species would have neural nets as an integral part of their cognition like we do. And we don't know what alternative paths there are to the evolution of intelligence.

gumballindie
9 replies
1d3h

I think the way life is built on earth is the rule throughout the universe. For any type of organic “computation” there needs to be something similar to a neural net. Perhaps a different “implementation” but it should be largely the same.

mewpmewp2
8 replies
1d1h

Yeah, the amazing thing though is how such a simple concept can emerge into a very complex behaviour and intelligence.

And the fact that we know that and now have proof in terms of intelligence of ChatGPT, is to me pretty clear that achieving AGI is possible if you crack the way to structure your neural network.

gumballindie
7 replies
1d1h

now have proof in terms of intelligence of ChatGPT

What.

CamperBob2
6 replies
1d

You are intelligent. But we could get an equally-intelligent reply out of ChatGPT. Ergo, it's also intelligent.

gumballindie
5 replies
1d

Yeah no, that's bullshit. It's like saying my laptop emits sound, meaning it can sing.

mensetmanusman
4 replies
19h56m

ChatGPT is a type of intelligence just like a dog or a bird is a type of intelligence.

gumballindie
3 replies
18h38m

Yup, the world has gone mad. Flat earthers, anti-vaxxers, now ai intelligence believers.

mensetmanusman
2 replies
17h57m

Not sure why saying artificial intelligence is a type of intelligence is controversial, it's in the name...

It's definitely not human intelligence if that's what you are wondering.

mewpmewp2
0 replies
17h47m

Hmm... What are you, an ATM machine believer?

gumballindie
0 replies
17h21m

Corporations tried selling us intelligent fridges for decades. It doesn't mean it was true. Nor does it mean a procedural text generator is intelligent. It performs well at emulating intelligence but it's about as intelligent as a rock - it can be used intelligently but on its own it does nothing.

guyomes
2 replies
1d6h

From what I understand, a Gene Regulatory Network [1] works very differently from biological neural circuits [2]. The perceptron used in computers [3], based on arithmetic operations, is again different from either of those two biological mechanisms. However, the perceptron appears to be a good tool for modeling functions that depend on a large number of parameters.

What we may infer from that is that after a lot of time of evolution, functions depending on a lot of parameters appear. Yet biological neural circuits seem quite bad at approximating simple continuous functions such as the multiplication of two numbers.

Gene Regulatory Networks and biological neural circuits are quite impressive structures considering their size, materials, and energy constraints. However, if you allow bigger sizes, more energy, and faster-conducting materials, it should be possible to have faster modeling tools.

Moreover, to better take into account non-linear functions, my two cents is that perceptrons could be further enhanced using piecewise polynomial functions instead of piecewise linear functions [4,5].

[1]: https://en.wikipedia.org/wiki/Gene_regulatory_network

[2]: https://en.wikipedia.org/wiki/Neural_circuit

[3]: https://en.wikipedia.org/wiki/Perceptron

[4]: https://doi.org/10.1109/TPAMI.2021.3058891

[5]: https://doi.org/10.1109/TPAMI.2022.3231971

mewpmewp2
1 replies
1d2h

To me it seems like the fundamental idea is still the same. You have nodes that are connected to each other with strengths that can change according to feedback and complicated logic emerges from that.

That seems like almost fundamental emergent idea that is behind all the intelligence in the World.

It seems to allow for intelligence to occur, without this type of concept everything would just be random and chaotic.

The point is, how amazing it is that complicated behaviour can arise from this simple idea.

At this point it seems, that neural networks should really be taught at schools as soon as possible.

guyomes
0 replies
20h54m

That seems like almost fundamental emergent idea that is behind all the intelligence in the World.

That is indeed quite nice, yet intelligence seems much more diverse in biological organisms and not always reducible to nodes and edge strengths. Nodes and edges are a way to store and modify data that seems particularly well adapted to mechanisms based on electricity.

On the other hand, there exist other very different (and slower) biological intelligent mechanisms that are not based on electricity. For example, a tree is perfectly adapted to its environment, capable of taking energy from the sun, materials from the ground, transforming them, and so on. Yet the intelligent mechanism that created trees is not based on neural networks as far as I understand it. In this mechanism, the data is stored in DNA, and DNA can be rewritten at each new generation. It is a completely different (and slower) approach.

Finally, I agree that the emergence of fast intelligence through neural networks is incredible, although for me the first impressive advance is the use of electricity to speed up information exchange. The arrangement into nodes and edges could follow naturally from the fact that the transmission of electric signals works better through 1D circuits. The second impressive advance is the way the networks are layered. It is critical, and very difficult, to have an arrangement of edges and nodes that can both be trained and inferred efficiently. Without the proper arrangement, a biological neural network or a computer circuit is much less intelligent.

Finally in the future we could imagine intelligent mechanisms based on quantum mechanics for example. In this case, it would probably be vastly different from nodes and edges, due to the underlying physical constraints that are different in electricity and in quantum mechanics.

tldr: yes, nodes and edges seem fundamental in fast intelligence based on electricity, but other very different intelligent mechanisms exist in biological organisms, and other very different intelligent mechanisms could still be invented in the future.

petsfed
0 replies
1d

Just a sidebar here: I think the urge to jump to esoteric conclusions is very strong because we are (rightly) trained not to expect that nature at large uses the same tools that we have developed. So it's very validating (and also kind of unnerving) when we do find a suggestion that we're "right" at a more fundamental level.

Consider that the uncertainty relations in physics fall neatly out of Fourier analysis, which could suggest that the universe also does Fourier analysis to figure out where things are and how fast they're going. But really, it's that classical interaction is a sort of mechanical Fourier transform.

I think that moment where it feels like we brush against the real clockwork of the universe has an eldritch, cosmic-horror quality to it, which is both addictive and maddening.

namaria
0 replies
20h42m

Sure, nothing profound about trillion-strong fleets of self-replicating nanobots controlled by chemical computers processing high-dimensional data about their environments, with complex entanglements giving rise to specialized clusters of information-processing neural masses, capable of then communicating with other similar entities, building knowledge about data processing and storage over millennia, starting with clay tablets and ending up with advanced machinery that can be used to create elaborate patterns of data flows that help us peer through the universe and inside our bodies, and understand in intricate detail everything that happens across billions of light years and over billions of years.

Nothing profound at all about any of that. Just very general structures.

stultissimus
5 replies
22h17m

Synthetic biologist here. This is a good answer, but looking at man-made genetic programs can give us a simplified perspective. We've made significant progress engineering programs that enable human T cells to kill target (cancer) cells _conditionally_ based on their surface marker expression [1]. This is in contrast to conventional CAR-T cells which are already on the market (see Kymriah, etc.). A simple AND gate T cell circuit is currently being tested in humans [2].

Numerous strategies now exist to write cellular decision making programs using synthetic circuits. We are entering an era where we can write DNA programs and put them into people; I wonder how we can interest more CS-minded people in this kind of "synthetic biology as programming", especially as we move from proof of concept studies in bacteria to real trials in humans [2].

[1]: https://pubmed.ncbi.nlm.nih.gov/33243890/ [2]: https://arsenalbio.com/2023/05/16/arsenalbio-announces-prese...
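
The AND-gate idea can be sketched in a few lines (the marker names and decision logic here are illustrative only, not the actual circuit from the linked work): the engineered cell acts only when both surface markers are detected, sparing cells that display just one.

```python
# Toy sketch of an AND-gated therapeutic cell: respond only when
# both target surface markers are present (illustrative logic).

def t_cell_response(marker_a: bool, marker_b: bool) -> str:
    if marker_a and marker_b:
        return "kill"    # both markers present: treat as a target cell
    return "ignore"      # one or zero markers: spare the cell

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", t_cell_response(a, b))
```

The engineering challenge, of course, is implementing that two-line conditional in receptor and transcription machinery rather than in silicon.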

mcnnowak
1 replies
22h0m

Any tips or "get your foot in the door" advice on how someone currently in a CS career, but interested in biology could make this career transition? As a simple CS-minded person, I don't see any LeetCode for synthetic biology sites :)

stultissimus
0 replies
20h34m

Reading life science textbooks and substacks is a great way to get started (see [1], [2]).

Also, there is a major deficit of software engineering talent in biology research (probably because pro SWEs are too expensive for academic labs, and those labs are where much of the foundational research is done). If you have the bandwidth, part time / volunteer work with an academic lab in your area could be a great way in the door.

[1] https://centuryofbio.com/

[2] https://substack.com/profile/11154869-niko-mccarty

FLT8
1 replies
21h22m

Sounds exciting, but also a little scary.

I'm imagining a biological variant of eg. Verilog or VHDL, as this seems like the kind of domain where tools that allow for formal verification would be highly desirable.

stultissimus
0 replies
20h37m

This has actually been done for genetic circuits in E. coli (bacteria) by Chris Voigt's lab at MIT. Their platform is called Cello [1], and it enables interoperation between Verilog/HDL and genetic code. This kind of thing has not yet reached practical utility in human cell engineering, but companies like Asimov [2] are pushing hard in that direction.

[1] https://www.cidarlab.org/cello

[2] https://www.asimov.com/

visarga
0 replies
11h4m

I was hoping someone who knows more than me would correct/expand my comment. Thanks!

huytersd
0 replies
21h56m

So it sounds like event based architecture. You send out events in response to other events and the event sender is “regulated” by other events.
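
The analogy can be sketched as a minimal event bus where "genes" subscribe to "transcription factor" events and may emit further events in response (the names are illustrative, not real biology):

```python
from collections import defaultdict

class EventBus:
    """Tiny pub/sub bus: handlers fire on events and may emit more events."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def on(self, event, handler):
        self.handlers[event].append(handler)

    def emit(self, event):
        for handler in list(self.handlers[event]):
            handler(self)

log = []
bus = EventBus()
# "Transcription factor A" triggers gene B, whose product is itself an event.
bus.on("TF_A", lambda b: (log.append("gene B expressed"), b.emit("TF_B")))
bus.on("TF_B", lambda b: log.append("gene C repressed"))
bus.emit("TF_A")
print(log)  # ['gene B expressed', 'gene C repressed']
```

The cascade-of-events structure is the part that maps well; what doesn't map is that in a cell every "handler" also perturbs the shared chemical environment of every other one.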

forceIsStr0ng
0 replies
1d

Yes, this is because DNA is a data structure, not a physical force.

If DNA of some sequence, then interaction with electromagnetic fields, gravity, strong/weak nuclear force of xyz properties will probably result in mutation along path abc, a theory the speaker must provide replicable statistical analysis to back up or it goes in the bin.

IT obsession with data models has pushed IT workers into pseudoscience that ignores entropy and Lindy effects: structure is mutable; physical forces are not. Data structures are not physical forces. They're leaky mental models that our biology garbage-collects due to entropy and Lindy effects.

Anyway just the take of an EE who finds the obsession with software development bizarre, full of statistical mirages, hallucinated accomplishments.

koeng
15 replies
1d4h

I’ve worked in a lab of one of the original folks working on genetic logic gates, and honestly, no they don’t really work very well at all.

I think a lot of computer scientists assume that their model of the world (binary logic) is the most abstracted version possible, and you can drill down other information carrying systems to that level, and then use that knowledge to build reliable systems.

Biology does not work that way. Its fundamental abstraction is different (mostly one of massive interconnectedness). Engineering logic gates and the like, mostly, doesn’t actually allow you to build better genetic circuits.

sebastianconcpt
4 replies
1d4h

I bet there are many more quantum-phenomena-related reasons why that system was selected for being less fragile and having better stability/survivability across a wider range of cases.

Thinking in classical compute terms here is like asking for robots that inevitably break without the slightest chance of self-repair, much less doing it at minimal effort.

rogerkirkness
3 replies
1d4h

Yeah I think there's a paper about how brains most likely represent some things with quantum states which may explain why we don't understand them yet.

jerf
1 replies
1d3h

There are many such papers. However they need to be understood as extremely speculative, if not outright woo-woo, as there is currently no reason to believe that the brain is capable of maintaining any sort of even modestly-large-scale entanglement properties.

These papers, and people commenting on them, also often appear to believe that just finding that entanglement is happening would itself mean something; it's not hard to read them as basically saying "and we found the 'Here There Be Magic' in the brain!" They don't even seem to speculate as to what the entanglement and "quantum" would be doing. What visible difference would it make versus a non-"quantum" brain? How could we determine that the brain is using "quantum" to do something that could not otherwise be explained or is somehow irreducible to some otherwise-classical phenomenon? Not only do I not see answers, not only do I not even really see theories, I don't even see much recognition that the question exists.

We have quantum computers right now. They are small, but they are fully quantum. They don't do anything by virtue of their quantumness. They aren't sitting there being ambiently quantum. They have to be set up, then executed, then answers read out, and the way in which they deviate from non-quantum computers is actually perfectly mathematically characterizable. And it is not clear to me exactly how those differences are supposed to be useful to describing how a brain works even if functioning quantum computers can be found. Basically, the problem of how a brain works is so vast that the question of whether or not it has some sort of quantum computer in it sinks without a trace, and certainly without providing any sort of explanation at the present time.

mrguyorama
0 replies
22h59m

People are desperate for "quantum" to be involved in the brain because it's the only possible way to resolve "Physical processes are mostly deterministic" -> "The brain is a physical process" -> "A deterministic brain means truly no free will, which has serious philosophical and ethical implications".

sebastianconcpt
0 replies
1d2h

Is it about the microtubules in brain cells?

Dr. Stuart Hameroff comes to mind [1]

His hypothesis is the "less incorrect" I've heard of.

[1] https://www.youtube.com/watch?v=YpUVot-4GPM&list=PLOfMRzYBxE...

titzer
3 replies
1d4h

I think a lot of computer scientists assume that their model of the world (binary logic) is the most abstracted version possible, and you can drill down other information carrying systems to that level, and then use that knowledge to build reliable systems.

It doesn't matter what Turing-complete model you have; they are all equivalent, even if exponentially slower than another. And no, binary logic is not the most abstracted version possible. Clearly lambda calculus is....ok, just kidding. You can build a universal computer out of just two stacks, out of subtract-and-jump-if-zero, and a lot of other things. (https://gwern.net/turing-complete)

I don't think computer scientists really think that binary logic is the only computation paradigm that makes sense, it just happens to be one that we can make extremely fast and extremely reliable realizations of using electronic circuits. It's been phenomenally successful.
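
For the curious, a minimal interpreter for one such one-instruction machine fits in a few lines. This sketches subleq ("subtract and branch if less than or equal to zero"), the standard single-instruction Turing-complete architecture; the memory layout below is an illustrative hand-assembled program.

```python
def run_subleq(mem, pc=0, max_steps=1000):
    """Execute subleq: mem[b] -= mem[a]; jump to c if the result <= 0."""
    for _ in range(max_steps):
        if pc < 0:                       # negative jump target = halt
            break
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Program: add mem[9] into mem[10] via the temp cell mem[11], then halt.
mem = [9, 11, 3,     # z -= x  (z becomes -x; result <= 0, so branch to 3)
       11, 10, 6,    # y -= z  (y becomes y + x; result > 0, fall through)
       11, 11, -1,   # z -= z  (zero; branch to -1: halt)
       3, 4, 0]      # data: x=3, y=4, z=0
run_subleq(mem)
print(mem[10])  # 7
```

Three subtractions and no explicit ADD instruction anywhere, which is the point: the "abstracted" primitives we pick are conveniences, not necessities.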

yosefk
0 replies
1d3h

I don't think GP implied that you cannot simulate cell chemistry on a Turing complete machine to a reasonable accuracy given enough resources. I think the point was that you can't think of genes as logic gates.

dacryn
0 replies
1d3h

You are limiting computation to discrete computation.

Biology is analog, which is a whole different ballgame compared to Turing completeness, and extremely hard to replicate with digital paradigms.

NoMoreNicksLeft
0 replies
1d3h

Is it not true that some of the confusion stems from the mistaken idea that DNA is computing/computable? One might make logic gates out of plumbing, but the plumbing at my house isn't some 0.1Hz computing device, it's indoor water so I can boil pasta, brush my teeth, and flush away smelly waste.

DNA's primary (though maybe not only) function is to act as a blueprint for constructing proteins. What little logic exists in it naturally has to do with conditionally expressing those only when it is advantageous to the organism. That's not necessarily the same thing as a logical if.

throwaway914
3 replies
1d4h

I'm equating this to condition-less coding, where you're trying to eliminate branching logic with bitwise operators or other math.
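
For readers unfamiliar with the style, here are two classic branchless idioms, sketched in Python for readability (in C these compile to straight-line integer code; the abs trick assumes values in two's-complement 32-bit range):

```python
def branchless_max(a: int, b: int) -> int:
    # (a > b) is 0 or 1, so exactly one of the two terms survives.
    gt = int(a > b)
    return gt * a + (1 - gt) * b

def branchless_abs(x: int) -> int:
    # Classic two's-complement trick for values in 32-bit range:
    # mask is -1 for negative x, 0 otherwise; (x ^ -1) - (-1) == -x.
    mask = x >> 31
    return (x ^ mask) - mask

print(branchless_max(3, 5))   # 5
print(branchless_abs(-7))     # 7
```

No `if` anywhere, yet the selection still happens, which is roughly the flavor of "logic without logic gates" that regulatory chemistry achieves.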

chaorace
2 replies
1d4h

Yeah, except you can only change one line at a time and you lose all of your changes if it doesn't compile at any point.

koeng
1 replies
1d4h

Sex changes many lines at a time. But you do die if it doesn’t compile.

mrguyorama
0 replies
22h58m

My friend has a defective genome, including a specific enzyme that doesn't "compile", and he's plenty alive.

tivert
0 replies
1d3h

I think a lot of computer scientists assume that their model of the world (binary logic) is the most abstracted version possible

This is a giant problem with software engineers, which makes us quite obnoxious.

We have mental models that work for understanding computers, and then go misapply those models to everything else in the universe. Frequently that turns into arrogance, by insisting the right way to think about anything is in computer terms, and every other way of thinking is inferior and wrong. Then you get stuff like Engineer's Disease, where some software guy lectures actual experts in other fields about how they're doing it all wrong, or declares alien fields to be nonsense because they're different.

Bjartr
0 replies
1d4h

I'd imagine genetically driven behavior would be probabilistic and involve many overlapping duration/threshold "checks"

dawnofdusk
13 replies
1d9h

An interesting question because all of this genetic encoding and expression is probabilistic at its very core (it has to be, otherwise there wouldn't be evolution), whereas it would be very bad if basic operations in a CPU had similar level of error rates.

Interesting to me how the top comment has talked about constructing logic gates out of biological circuits. I wonder if anyone has done the opposite, i.e., write a probabilistic programming language whose operations are under the same amount of noise as a cell?
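As a toy sketch of what such a language's primitive might look like (the 1% flip rate and the redundancy count are invented numbers, not measured cellular noise): a logic gate whose output randomly flips, with reliability recovered by voting over many redundant copies, loosely the way cells tolerate stochastic expression.

```python
import random

def noisy_nand(a: bool, b: bool, p_error: float, rng: random.Random) -> bool:
    """NAND gate whose output flips with probability p_error,
    loosely mimicking stochastic gene expression."""
    ideal = not (a and b)
    return (not ideal) if rng.random() < p_error else ideal

def majority_vote(a: bool, b: bool, p_error: float, copies: int, seed: int = 0) -> bool:
    """Cells buy reliability partly through redundancy; emulate that
    by running many noisy gates and taking a majority vote."""
    rng = random.Random(seed)
    votes = sum(noisy_nand(a, b, p_error, rng) for _ in range(copies))
    return votes > copies // 2

# With a 1% flip rate and 101 redundant gates, the majority is almost
# always the ideal NAND output.
for inputs in [(True, True), (True, False), (False, False)]:
    print(inputs, "->", majority_vote(*inputs, p_error=0.01, copies=101))
```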

tralarpa
4 replies
1d8h

whereas it would be very bad if basic operations in a CPU had similar level of error rates

Evolution is about mutations when the executable file is copied.

withinboredom
3 replies
1d8h

That isn't how evolution works (or at least it is a gross generalization that is borderline wrong).

Your own cells mutate over your lifetime, and the mutations may trigger genetic features across your entire body. Sometimes these mutations cause errors that we call cancer, but not always.

Evolution is when a certain genetic feature provides better fitness (which doesn't necessarily have to come from a mutation), and then gets selected through mating. For example, if suddenly people with brown hair were a better mate, it would be much more likely for the next generation to have brown hair, until non-brown hair were "evolved" out of the gene pool completely and it would be impossible to not have brown hair.

This is how we got orange carrots, which are distinct from the previous non-orange carrots, for example.

willy_k
1 replies
22h7m

So evolution is copying the state of a running executable (one that’s poorly memory managed and dependent on external variables) instead of the file itself, and evolution favors the copies that survive and maybe even run faster or are otherwise better.

withinboredom
0 replies
11h10m

Not exactly, evolution doesn’t happen instantly. A digital comparison to an organism would be something like this:

An organism is a collection of billions of threads all running the same code but starting at seemingly random entry points. Before dying, each thread forks one or more threads (depending on external factors like available memory).

In order to reproduce, the “father” code sends a copy of its current code, but only the bottom nibble of every byte. The “mother” program combines this with the top nibble of every byte and then executes it in a chroot jail to make sure it will actually run. Once it is confident it will run, it “births” it onto its own machine.

Every thread in this process is running on non-ECC memory, with cosmic rays bit-flipping things, though there are threads running around making fixes to broken threads (actually, just killing them before they can fork).

The implementation of the threads isn’t relevant here (such as modeling proteins, atp pumps, and such).

In this, evolution fitness would be less cancer (fork bombs), doing usable work, successfully mating, etc. This evolution pressure might look like preventing mating until the program is a certain age or performed certain milestones and a “score” of how well it has done its work (both partners want a good “life score” but not too high — liars could exist! — to proceed with mating).

Evolution would occur naturally over many generations. From one generation to the next they look nearly identical, but across hundreds of generations they might still look identical, or not. The “not” part is evolution.

tralarpa
0 replies
1d

which doesn't necessarily have to come from a mutation

My comment referred to OP's association of evolution with errors in genetic encoding and expression, hence my quote.

Karellen
3 replies
1d7h

it would be very bad if basic operations in a CPU had similar level of error rates.

Given how slow evolution is, and how many times DNA is copied and/or transcribed in any one individual, my intuition is that the error rates for genetic processes are actually incredibly low.

mrguyorama
1 replies
23h8m

There are numerous correction mechanisms, and other mechanisms that destroy malformed proteins and mRNA chains. Wikipedia claims one out of every 1000 to 100000 amino acids added to a forming protein is wrong. Each ribosome is proceeding to add amino acids to a chain at about 10 per second, and millions (or way more) proteins are being produced all the time.
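Taking those quoted rates at face value, a quick back-of-envelope (the 400-residue "typical" protein length is an assumption for illustration) shows why the correction and destruction machinery matters:

```python
# Chance a protein comes out with zero wrong residues, given the
# per-amino-acid misincorporation rates quoted above. The 400-residue
# "typical" protein length is an assumption for illustration.
def fraction_error_free(error_rate: float, length: int = 400) -> float:
    return (1.0 - error_rate) ** length

for rate in (1e-3, 1e-5):
    print(f"rate {rate:g}: {fraction_error_free(rate):.1%} of proteins are perfect")
```

At the pessimistic 1-in-1000 end, roughly a third of 400-residue proteins carry at least one wrong amino acid, which is why downstream quality control (and tolerance of near-miss substitutions) matters so much.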

Karellen
0 replies
17h30m

Chip fabs have QA departments that check for bad batches and make sure they never get used too, though. ;-)

Ancapistani
0 replies
23h7m

My intuition says that error rates are extremely low, but that errors are common - due to the incredible number of iterations that occur in a very short time period.

nihzm
1 replies
1d8h

probabilistic programming language whose operations are under the same amount of noise as a cell

The reason for the probabilistic nature is that biological "computations" are electrochemical reactions and feedback loops, which do not map very well to the concept of "executing code" as in programming languages. I think a closer analogy could be a hardware description language that synthesizes analog circuits for computations (cf. analog computers) which are then subject to noise from electromagnetic radiation in the environment.

So in a certain sense, this has already been done in a very rudimentary way during the pre-digital age of computing.

dawnofdusk
0 replies
1d7h

Analog computing is trendy again, though, especially for AI.

yieldcrv
0 replies
1d7h

LLMs add just enough entropy

Daub
0 replies
1d9h

under the same amount of noise as a cell?

Can you share what you mean by this?

thih9
10 replies
1d9h

There are no equivalents of function calls. All events happen in the same space and there is always a likelihood of interference.

Proof that code with all functions inlined, with only global variables, with bugs and happy accidents all over the place, eventually gains sentience and becomes self aware.

visarga
4 replies
1d8h

Evolution is a wonderful algorithm - open ended, and yet in a single run it created all life, including human culture and technology.

Daub
2 replies
1d8h

What I find fascinating about evolution is that it is entirely un-intentional. There is no-one/nothing directing it yet its outcome presents (and can be discussed as) a design.

visarga
0 replies
1d8h

There is an imperative force internal to evolution - self replication. Information must propagate (replicate) from the past into the future. But this leads to competition for resources, and hence drives evolution.

bmartin13
0 replies
20h47m

How would you know if a higher dimensional being was directing it? We can't easily detect something outside our current physical dimensions.

We know that for our math to work there have to be a number of higher dimensions. I think it's more logical to believe this came from something instead of from nothing.

quickthrower2
0 replies
1d8h

Single run?

debesyla
1 replies
1d8h

Maybe that's the true way how to create an AGI...

pezo1919
0 replies
21h36m

We are the AGI :)

curiousParty
1 replies
1d8h

This isn't proof. It's wishful thinking. We don't yet understand the data structure of DNA fully; we don't know how a chemical storage system that is inherently inert and relies on nanomachines (that it itself is responsible for providing the code for creating) to be useful even exists, let alone how it was made. How many happy accidents would it take to create Cloudflare? This is overwhelmingly more complex than Cloudflare and yet it works, and not only works robustly, but with exquisite UX.

thih9
0 replies
1d3h

Looks like we understand it enough to say it's all a big ball of mud[1], at least based on that SO comment. Or are you saying that the comment is wrong? And yes, the effect is more complex than Cloudflare. Then again, it is a ~6Gb (~0.85GB) executable blob[2]; this doesn't include the training sets and it took a bit to get there. Who knows, if Cloudflare spent 4.5 billion years and a whole planet of resources on their next version, the result might have exquisite UX too (plus an ability to write tangential HN comments, and many more).

[1]: https://en.wikipedia.org/wiki/Anti-pattern#Big_ball_of_mud

[2]: "In humans, the total female diploid nuclear genome per cell extends for 6.37 Gigabase pairs (Gbp)", https://en.wikipedia.org/wiki/DNA#Amount

echelon
0 replies
1d8h

Signal transduction is async amplification. And those pathways can branch. Close enough.

Daub
10 replies
1d9h

I teach a creativity and innovation course at my uni. There are many examples I share with my students of inventions which started life as observations of nature (e.g. Velcro came about from observing how burrs stick to the fur of the inventor's dog). My bones tell me that there are discoveries about computing waiting to be made from observing how things are done in nature, particularly in the human mind. I imagine the result to be fundamental: changing the very nature of how computing is conceived of.

marginalia_nu
4 replies
1d8h

A lot of how we conceive of computation today is based on observation of nature. It's the logical conclusion of the study of logic, a line of inquiry that dates back to antiquity, when sophistry was becoming a real institution and there was not yet any systematic way of determining whether something was true other than whether it sounded true.

Daub
3 replies
1d3h

I'm not sure that nature and logic are related in the way you imply.

A different question: is not the atomic indivisible of computing no more than a simple switch? A binary? Is there an equivalent for this in the human mind/brain? I suspect not, but would like to hear from someone who knows more.

breischl
1 replies
1d1h

Is not the atomic indivisible of computing no more than a simple switch? A binary?

It is for the dominant computing paradigm currently in use, but that isn't a universal truth. As I understand it, we use binary logic primarily because it's more tractable for humans to think about and work with, not because it's the only option or even the best option in all cases. For instance there are plenty of analog computers in history, which are in some ways more closely related to biological "computers".

https://en.wikipedia.org/wiki/Analog_computer

OkayPhysicist
0 replies
1d

To my understanding, the main advantage of digital over analog logic in computing was (and probably still is) its resistance to small errors. Nonlinear components like transistors and vacuum tubes aren't exactly the most consistent parts, and factors like ambient temperature can make even otherwise linear parts a little wonky. The "on/off" paradigm gives you a lot of tolerance of deviations from the ideal. Storage is also a big factor: it's really hard to store analog values, whereas on/off is approximately trivial.
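A toy illustration of that error tolerance (the noise level is invented and this is not a real device model): analog errors accumulate across stages, while snapping to a logic level at each stage discards them.

```python
import random

def restore(voltage: float) -> float:
    """Digital restoration: snap an analog value to the nearest
    logic level (0 or 1)."""
    return 1.0 if voltage > 0.5 else 0.0

# Pass a "1" through 100 noisy stages. The analog path keeps every
# perturbation; the digital path discards them at each stage.
rng = random.Random(0)
analog = digital = 1.0
for _ in range(100):
    noise = rng.gauss(0, 0.05)   # illustrative per-stage noise
    analog += noise
    digital = restore(digital + noise)

print(f"analog ended at {analog:.3f}, digital ended at {digital:.0f}")
```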

marginalia_nu
0 replies
1d2h

Logical thought is the lines of thought that consistently lets us predict nature. Without nature there would be no such thing as fact, and without fact, there can be no logic, since a proposition can't be true or false.

gilleain
1 replies
1d8h

I recommend this book a lot, but 'Cats’ Paws and Catapults: Mechanical Worlds of Nature and People' (https://www.goodreads.com/book/show/223609.Cats_Paws_and_Cat...) is a great look at how biology inspires human design.

One idea that is important (IIRC from this book) is that many examples are really inspiration, and not direct copying of design. Biology works very differently to human machines, with very different constraints, so when engineers and designers try to stick too closely to the biological original, it may not work out very well!

Daub
0 replies
1d3h

A great reference. Many thanks.

Medianmax
1 replies
1d8h

Convolutional neural networks come to mind.

optimalsolver
0 replies
22h29m

There's the entire field of nature-inspired optimization:

https://arxiv.org/pdf/1307.4186.pdf

akira2501
9 replies
1d9h

My favorite thing about mtDNA is that two separate genes overlap using separate reading frames. The end of one gene is the same as the start of another, and they are laid out in the circular mitochondrial genome to take advantage of this fact.

I've also read that DNA chromosomes can undergo conformational changes in response to the environment they're in, making certain reading frames more or less likely to be transcribed, which makes DNA something like an environmentally sensitive memory subsystem for RNA.

I've always wondered whether this mechanism is involved in how the homeobox genes work to alter genetic expression across the "floor plan" of the body.

Anyways, I guess all this means, to whatever extent you might be able to identify programming "constructs" within the system, the overall effects are going to be dominated by noise and emergent behaviors, and the overall mode of the system is one of "feedback control loops."

6177c40f
3 replies
1d1h

I think you'll find some of the tricks that (mostly) viruses use to be interesting. Take a look at ribosomal frameshifting [1], where the viral RNA has a special structure such that at a certain point during translation in the ribosome it will skip back 1 nucleotide (a -1 frameshift; -2 and +1 frameshifts have been observed as well, more rarely), thus changing the whole reading frame.

Viruses use this to encode multiple protein products in a single strand of RNA, a sort of compression. But that's not all. Say that the amount of "0-frame" protein to "-1-frame" protein needs to be at a ratio of 20:1; then if the frameshift occurs with a probability of about 5% (i.e. 5% of the time that the viral RNA is translated), these protein products will then be produced in just the right ratio (this isn't a made up example either, see [2]). So not only does this trick allow for the RNA to be compressed, it also regulates protein expression. All with just one strand of RNA.

[1] https://en.wikipedia.org/wiki/Ribosomal_frameshift

[2] https://en.wikipedia.org/wiki/HIV_ribosomal_frameshift_signa...
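The ratio argument is easy to sanity-check with a toy simulation (the 5% figure is the one from the example above; everything else is invented):

```python
import random

def translate_many(n_passes: int, p_shift: float, seed: int = 42):
    """Count ribosome passes yielding the 0-frame product vs. the
    -1-frame product, given a per-pass frameshift probability."""
    rng = random.Random(seed)
    zero_frame = minus1_frame = 0
    for _ in range(n_passes):
        if rng.random() < p_shift:
            minus1_frame += 1   # frameshift: -1-frame protein made
        else:
            zero_frame += 1     # normal translation: 0-frame protein
    return zero_frame, minus1_frame

z, m = translate_many(100_000, 0.05)
print(f"0-frame : -1-frame ratio ≈ {z / m:.1f} : 1")
```

With p = 0.05 the expected ratio is (1 − p)/p = 19:1, close to the 20:1 target in the example, with no regulatory machinery beyond the frameshift probability itself.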

bglazer
1 replies
22h57m

Fuck that’s crazy.

For the CS people who don’t understand what this means, a rough analogy is that HIV does something like steganography to encode two proteins in one gene.

pyinstallwoes
0 replies
15h59m

Quine based attack vectors on DNA platforms as a way to use the substrate as a reconfigurable bootloader through changing expected address. Virus.

speps
0 replies
1d7h

The end of one gene is the same as the start of another

Kind of like "de Bruijn sequences" which can be used to reduce the total length of brute force attacks on pins.
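For the curious, here's the standard recursive construction (this is the textbook algorithm, nothing DNA-specific): for 4-digit PINs over digits 0-9, the unrolled sequence needs 10,003 digits instead of 4 × 10,000.

```python
def de_bruijn(k: int, n: int) -> str:
    """de Bruijn sequence B(k, n): every length-n string over an
    alphabet of size k appears exactly once as a cyclic substring.
    Standard recursive (Lyndon word) construction."""
    a = [0] * k * n
    sequence = []

    def db(t: int, p: int) -> None:
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return "".join(map(str, sequence))

seq = de_bruijn(10, 4)        # covers every 4-digit PIN
padded = seq + seq[:3]        # unroll the cycle for linear entry
print(len(seq), len(padded))  # 10000 digits cyclic, 10003 linear
```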

pyinstallwoes
0 replies
1d7h

That reminds me of how the Vedas Sanskrit flows and also forms error-correction from verse to verse.

The beginning is the end is the beginning.

moring
0 replies
1d9h

You might find the coding for selenocysteine interesting: https://en.wikipedia.org/wiki/SECIS_element

tl;dr: selenocysteine doesn't have a normal coding as a base triple, but as re-interpretation of a stop codon due to the information stored _after_ that stop codon that makes an RNA stick to itself during translation.

dekhn
0 replies
1d1h

overlapping genes show up all over the place and I think they contribute to lots of errors in papers. During my postdoc I analyzed the results from a yeast gene survey where the authors ignored genes overlapping and misannotated genes, imputing their function when really it was the overlapping gene. This is where I started to learn that "if it's in Nature, it's probably wrong"

collyw
0 replies
1d7h

Is that specific to mtDNA? From what I remember it isn't that uncommon. There are a lot of bits of DNA that work in both directions for different genes. It's been a couple of decades since I studied that, though.

bmartin13
6 replies
1d1h

I'm pretty sure something this complex came from a higher order being. That whole order from chaos argument is not holding water for me with this degree of complexity.

andreyvit
2 replies
1d

I'm afraid your intuition about complexity misleads you here. If you'd like to learn more, spend some time with say evolutionary algorithms. It's not a big time commitment; playing around for a couple days will likely be enough. You'll find how the crap an evolution-style search comes up with is really similar to how nature works: a few beautiful core finds + a big pile of hacks of varying craziness, some of them totally bonkers, and quite a bit of insane reuse of things one would never consider reusing.

mensetmanusman
0 replies
17h58m

The operating system of nature is just as complex.

bibelo
0 replies
5h6m

Where do these algorithms come from?

IanKerr
1 replies
1d

And that complex higher order being came from where?

bmartin13
0 replies
20h59m

No idea, but the math says there have to be way more dimensions than we can interact with, so I'm thinking it's a higher dimensional being that we can't interact with. It/they would have to enter our dimension to communicate with us.

thechao
0 replies
1d

I see the opposite: the DNA/RNA/protein mechanism is just a gigantic pile of self-interacting hacks — core "design constraints" being that the cell: (1) shouldn't die too quickly; and, (2) should be reasonably energy efficient (due to competition).

Sparkyte
4 replies
1d8h

That is an interesting thought. I take this with a grain of salt: one gram of DNA is supposedly 455 exabytes of data.

dekhn
3 replies
1d

Here, let's do some math. Let's treat a base pair as 2 bits of data (4 bases, so 2 bits to encode a base; the strands are complementary, so a pair still carries 2 bits). The average molecular weight of a DNA base pair is twice that of a single nucleotide, approximately 660 daltons (or 660 grams/mole).

So a gram of DNA is 1/660 moles, or 6.022 × 10^23 / 660 ≈ 9 × 10^20 base pairs, making 1.8 × 10^21 bits, which is around 230 exabytes: within a factor of two of your estimate.

(also, in case it comes in handy: pure water is 55Molar)
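The same back-of-envelope as a snippet (assuming 2 bits per base pair and a 660 Da average pair mass):

```python
AVOGADRO = 6.022e23   # base pairs per mole
BP_MASS = 660.0       # grams per mole for an average base pair
BITS_PER_BP = 2       # 4 bases -> 2 bits; the strands are complementary

def exabytes_per_gram(grams: float = 1.0) -> float:
    base_pairs = grams / BP_MASS * AVOGADRO
    return base_pairs * BITS_PER_BP / 8 / 1e18  # bits -> bytes -> EB

print(f"{exabytes_per_gram():.0f} EB per gram of DNA")
```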

pama
2 replies
22h54m

Thanks for the neat estimate. It would be hard to dehydrate DNA to a significant degree, so adding some water and counterions would probably reduce your estimate by a bit. (Error correction would also reduce things by a bit.) Overall the estimate of ~500 exabytes per gram seems reasonable to me.

dekhn
1 replies
18h30m

I think there are worse ideas than cloning your data into a population of tardigrades using an error-correcting code, and then dehydrating them into the tun state. Not nearly the density of raw dry DNA (which itself can be stabilized using trehalose or PVA), but tardigrade tuns can resist harsh environments for ~millennia.

pama
0 replies
2h30m

Agreed. I hope there will be commercial scale inexpensive solutions of this type in the next ten years.

zubairq
2 replies
1d1h

I’ve always wondered about DNA from a programming language point of view and always thought it was more like a data structure than a language

abhiyerra
1 replies
1d1h

I’ve always thought Lisp is the closest approximation to DNA.

__s
0 replies
1d1h

I guess you're thinking of functional lisp? SKI combinators except instead DNA is GCAT combinators

scrozier
2 replies
1d2h

In middle age, I (a computer science grad and working programmer) went to graduate school for molecular biology. In the early days of studying biological systems, I was struck by how much they were like computer code, or at least logic constructs. I remember to this day the moment I realized that my orderly world of 0s and 1s was being saturated and methylated and energized in ways that were utterly unpredictable by (the current state of) simple logic constructs. And that's the day I learned why we still have "wet" labs, to actually test things empirically, rather than vainly try to predict them computationally. :-)

herval
1 replies
1d2h

Stephen Wolfram's theory is that everything is a massively parallel computation. Pretty fascinating stuff

https://writings.stephenwolfram.com/2020/04/finally-we-may-h...

zelphirkalt
0 replies
1d2h

Does he define computation? I am asking because just about everything could be seen as some form of computation, given no definition, duh.

psyklic
2 replies
1d8h

In grad school, I (briefly) researched synthetic DNA logic gates. Using these, you can build neural networks in DNA [1] and probabilistic switching circuits (my co-authored paper) [2].

To do this, you design DNA strands that bind to each other in designated regions when mixed.

To make a logic gate, you just make some bindings conditional on other bindings. To detect that the gate is closed, you make the final binding uncover a fluorophore, which is detectable by a machine. (This is all called a *displacement cascade*.)

Using our techniques, the DNA bindings could not be undone. However, I imagine that a source of new DNA strands to replenish the old would effectively implement re-callable functions.

[1] Neural nets in DNA: http://qianlab.caltech.edu/nature10262.pdf

[2] Probabilistic switching circuits: https://www.pnas.org/doi/full/10.1073/pnas.1715926115

samplatt
0 replies
1d1h

Fascinating. Have you read the short story, The Moral Virologist? I've always been curious how close to realistic it was, and you seem like a person who could answer that.

Aardwolf
0 replies
1d7h

I remember seeing a video by Steve Mould about exactly this (playing tic tac toe against DNA): https://www.youtube.com/watch?v=GgPdRKqcRTE

gameman144
2 replies
1d1h

One really cool mechanism I didn't see listed there (at least directly) is riboswitches. Essentially, a segment of mRNA with a binding site for a molecule. Ribosomes translating the mRNA will then branch into two separate flows depending on whether the molecule is bound or not.

https://en.m.wikipedia.org/wiki/Riboswitch

pelagicAustral
1 replies
1d1h

Polymorphism?

actionfromafar
0 replies
1d1h

Or spaghetti assembler code

robviren
1 replies
1d6h

I always think of DNA as a lot like programming over the course of millions of years: a strung-together series of hackish, barely functioning, uncommented code with no documentation whatsoever. Any reason why a snippet of code ended up the way it is has been completely lost to time. All we know is that changing it is bad, certain chunks of code lead to certain behavior, and the more we look at the code, the more spaghetti-like it appears.

InSteady
0 replies
1d1h

The ultimate spaghetti codebase.

platz
1 replies
1d3h

    IF : Transcriptional activator; when present a gene will be transcribed.
    WHILE : Transcriptional repressor; gene will be transcribed until repressor is not present.

As stated, IF and WHILE are equivalent (the WHILE is a variation on the contrapositive of an IF).

secondly, it doesn't make sense that a "repressor" would cause transcription unless it is absent, since " A transcriptional repressor is a protein that regulates gene expression by inhibiting the initiation of transcription".

It is a repressor's PRESENCE that inhibits expression, not its ABSENCE. Hence it would make more sense to say "Transcriptional repressor; gene will be transcribed until repressor is PRESENT."

holmium
0 replies
1d

Maybe there were thinking about it in terms of disinhibition?

    A --| B --| C
Where `A` inhibits `B` which inhibits `C`. So, while the repressor `A` is present, `C` will be transcribed. I'd imagine that simple repression is probably more common than disinhibition in gene networks, but idk.
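Treating presence as True and repression as logical NOT, the chain above reduces to a two-line sketch:

```python
def repress(present: bool) -> bool:
    """Treat repression as logical NOT: the target is expressed
    only when its repressor is present to shut it off."""
    return not present

def chain_output(a: bool) -> bool:
    """A --| B --| C : A represses B, which represses C."""
    b = repress(a)
    return repress(b)

# Double repression acts like activation: C tracks A.
for a in (False, True):
    print(f"A={a} -> C={chain_output(a)}")
```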

nihzm
1 replies
1d9h

Related to the first answer: Tim Blais on youtube has made a catchy "edutainment" song [1] on molecular machines based on A. Leigh's research (eg [2]) with some cool animations that show how electrochemical "switches" can encode a binary state, which one could (in principle) use to build logic gates.

[1]: https://www.youtube.com/watch?v=ObvxPSQNMGc

[2]: https://pubs.acs.org/doi/10.1021/acs.chemrev.5b00146

yread
0 replies
21h33m

There is this 2021 paper where they constructed a logic OR gate protein:

https://www.nature.com/articles/s41467-021-26937-x

Some companies are planning to use them when designing drugs: "only activate this if both antibodies bind"

hnthrowaway0328
1 replies
1d

Judging from:

It is also to be noted that DNA is just a set of instructions and not really a fully functional entity (it is functional to some extent). However, even being just a code it is comparable to a HLL code that has to be compiled to execute its functions. See this post too.

Maybe microcode?

spaceribs
0 replies
23h56m

After reading the reason why code isn't a good analogy, the mechanical/analog logic of old pinball machines might be a better example of how DNA processes instructions?

Ref: https://www.youtube.com/watch?v=E3p_Cv32tEo

dekhn
1 replies
1d2h

Yes, DNA has rough equivalents to conditional logic and loops, although not in a way that is as engineerable as a digital computer with a stack. The op-amp (an amplifier circuit built around feedback) is the one element that has a close analogy (hah) in biology: feedback systems based on transcription-controlling regulatory proteins.

The other really cool example that came up early in my education is the yeast mating switch. Yeast (s. cerevisiae) can be one of two "genders" and can only mate with the opposite gender. Yeast can change their gender: they keep two copies (alternates) of a single gene, and "cut/paste" one of the alternates into a different "currently active gender" location (https://en.wikipedia.org/wiki/Gene_conversion is the general process, https://en.wikipedia.org/wiki/Mating_of_yeast#Mechanics_of_t... is the mechanical process in yeast). It has some similarity to a flip flop.

So I think the right level to think at is not the abstraction of computer programming languages, but circuit elements. Which is what makes these brilliant articles so much fun: https://www.cell.com/cancer-cell/pdf/S1535-6108(02)00133-2.p... and https://journals.plos.org/ploscompbiol/article?id=10.1371/jo...

To cut to the chase, engineering biology is a major pain due to the long evolutionary history of biological mechanisms, which makes them somewhat esoteric and abstruse. There is so much that can be said, but over time, I've found that expounding on this is not very helpful.
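A minimal sketch of the circuit-element view (negative autoregulation: a protein represses its own production; all constants are illustrative, not from any real gene):

```python
def simulate_autorepression(steps: int = 200) -> float:
    """Toy negative autoregulation: a protein represses its own
    production (Hill-like term) and decays at a constant rate.
    All constants are illustrative, not measured."""
    protein = 0.0
    K, beta, gamma = 1.0, 1.0, 0.1  # threshold, max rate, decay rate
    for _ in range(steps):
        production = beta / (1.0 + (protein / K) ** 2)
        protein += production - gamma * protein
    return protein

# The feedback settles at the fixed point of beta/(1 + p^2) = gamma*p,
# which is p = 2 for these constants.
print(f"steady state ≈ {simulate_autorepression():.2f}")
```

This is the op-amp-like behavior: a feedback loop that holds a level steady against perturbations, rather than anything resembling a call stack.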

arethuza
0 replies
1d2h

Isn't this the ultimate case of "Software archeology in a mature programming environment" (as Vernor Vinge might put it)?

arketyp
1 replies
1d9h

It's curious how DNA and enzymes are quite literally like a Turing machine tape and head.

cantachau
0 replies
1d8h

The only question is: is DNA a Turing-complete machine?

JoBrad
1 replies
1d3h

So in Biology everything is a global variable and side effects are the standard. Got it.

pama
0 replies
1d

Global variables are true for viruses or single cell organisms. There is locality between different cells and partial locality within cell compartments. So not everything is a global variable.

AlbertCory
1 replies
1d1h

PCR was partially inspired by loops in programming languages, according to Kary Mullis, who got the Nobel for its invention.

(I heard him say it at Google, but I just checked YouTube and I don't think that particular visit is there. Maybe it's in other places?)

cheschire
0 replies
1d1h

I had to check Kary Mullis's wikipedia page to figure out that PCR stood for polymerase chain reaction. While there, I got the distinct impression someone who disliked him was responsible for editing quite a lot of his article.

wcoenen
0 replies
22h40m

To get a glimpse of how cell "logic" works, I think clocks are a nice starting point.

https://www.youtube.com/watch?v=nq7v1Z2SmdQ

subroutine
0 replies
1d8h

All the various logic gates have been observed in (DNA) gene expression studies - both naturally occurring and bioengineered tools. In particular there are tons of bioengineered tools for inducible gene expression that have all manner of gating mechanisms for precise temporal control of expression (triggered by behavior, environment, drug, elapsed time, etc.).

Reading...

https://en.wikipedia.org/wiki/Cre-Lox_recombination?wprov=sf...

https://en.wikipedia.org/wiki/Receptor_activated_solely_by_a...

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2802553/

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4772104/

sojuz151
0 replies
1d4h

Alternative splicing is another way of including IF statements. Depending on many conditions some parts of the RNA might be added/removed before the translation

shpx
0 replies
1d8h

What about length-prefixed strings? There are obviously "byte"-terminated strings (start and stop codons), but there are probably situations where it makes sense for biological systems to store the length (or likely reading different lengths produces different molecules, as a form of compression) at the beginning and count it out.
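The sentinel-terminated half of this is easy to make concrete; a toy reading-frame scanner (the example sequence is invented):

```python
START = "ATG"
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq: str) -> list[str]:
    """Toy open-reading-frame scanner: each 'string' begins at a start
    codon and runs to the first in-frame stop codon (the terminator).
    No length prefix anywhere; the sentinel does all the work."""
    orfs = []
    for frame in range(3):
        for i in range(frame, len(seq) - 2, 3):
            if seq[i:i + 3] == START:
                for j in range(i + 3, len(seq) - 2, 3):
                    if seq[j:j + 3] in STOPS:
                        orfs.append(seq[i:j + 3])
                        break
    return orfs

print(find_orfs("CCATGAAATTTTAGCC"))  # one ORF, found in frame 2
```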

sargstuff
0 replies
21h44m

Programming languages typically operate in the middle of one higher and one lower order level of logic. DNA describes hyperbolic, multi-level logic.

The lowest-level programming language (assembly) is designed to map/correspond/translate directly to hardware electronic bits, where the CPU emulates a version of a Turing machine on assembly instructions, and the CPU/hardware is under control of the OS (post boot strap).

---

The mapping between hardware and assembler may be static (compiled) or interpreted in real time into a sequence of boolean ANDs and ORs. At the assembly programming language level, the programming language TERMINAL construct is 1:1 with hardware. Such abstractions are typically conveyed using BNF and/or various calculi such as epsilon and lambda. As one progresses through higher-level programming languages, one has to work through more and more non-terminal instructions (aka grammar) to get to the TERMINAL value and access that 1:1 mapping to 'run'.

---

Given constraints on article length, I'm going to reduce the body's hyperbolic-geometry descriptions down to a simplified analytic-geometry analogy via the body's largest organ: the skin. One can look at the skin as a Möbius strip where the GI tract is the internal-side complement of the external skin. Organs can be viewed as Klein bottles chained together that interact with and influence each other (per flow through the Klein bottle opening and/or perfusion of the Klein bottle surface). Scaling the Klein bottle down to a cell gives a simplified 2D spreadsheet: side a, side b, and how to progress between a and b. The external/internal enzymes, DNA, vitamins/minerals, etc. influence what happens on the spreadsheet via the DNA BNF / 4-symbol automaton system and/or L-system. Internal function calls are equivalent to specific intra-cell functions triggered by cell resources. Obviously, organs/cells need 'external function calls/libraries' to work, which are not covered here. While loops / if statements are defined by the enzymes/proteins/etc. created by DNA interpretation, e.g. while enough protein a exists -> create enzyme b; if enzyme asdf exists -> create protein 1234.

---

So DNA can be viewed as a type of Turing machine, i.e. it has if/while/etc. forms of expression, and the rest of the body is just a lambda calculus / epsilon calculus / process calculus interpreter/executor. DNA is the ultimate hyperbolic Turing-paper-tape version of the Zark king[0]: the unbounded creeping feature creature taken to the extreme over a few millennia.

---

[0] : https://manybutfinite.com/post/the-thing-king/

rsingla
0 replies
1d

For an individual interested in computational biology, George Church's course is excellent.

From the description: "This course will assess the relationships among sequence, structure, and function in complex biological networks as well as progress in realistic modeling of quantitative, comprehensive, functional genomics analyses. Exercises will include algorithmic, statistical, database, and simulation approaches and practical applications to medicine, biotechnology, drug discovery, and genetic engineering."

https://ocw.mit.edu/courses/hst-508-genomics-and-computation...

pockmockchock
0 replies
1d9h

Aren't mRNA, CRISPR, and prime editing doing exactly that?

photochemsyn
0 replies
1d3h

IF lactose is present in the extracellular environment && IF glucose is NOT present in the extracellular environment:

Activate the lac operon and transcribe the genes for lactose uptake and metabolism, entering a while loop:

WHILE lactose is present: Keep transcribing those genes.

However, this is not the DNA acting alone, it's just a component in the cellular system. The cell (E. coli in this case) is always generating a minimal amount of lactose and other sugar transporters, which act as sensors in the cell membrane that trigger feedback loops (otherwise, how would the cell ever know lactose was present?). If glucose is present, E. coli uses it preferentially and downregulates other sugar uptake and metabolism genes. This is all fairly analog rather than digital, i.e. genes that are 'turned off' may still be transcribed at a very low level.

DNA is probably more comparable to RAM, and the CPU perhaps comparable to the transcriptome (DNA -> mRNA) and the ribosome (mRNA -> protein), and overall the cell is running something comparable to a parallel multi-process fetch-decode-execute cycle.
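The lac operon logic above can be sketched as a toy model in Python. The threshold and baseline values are invented for illustration; the one biological point it preserves is that "off" is analog, i.e. still leaks a little transcription:

```python
def lac_operon_step(lactose, glucose):
    """Return the transcription level of the lac operon genes.

    Toy model with invented constants: expression is analog, not
    binary -- a repressed operon still leaks a baseline level.
    """
    BASELINE = 0.02  # leaky expression even when repressed (hypothetical value)
    if lactose > 0 and glucose == 0:
        return 1.0   # fully induced: transcribe lactose-metabolism genes
    return BASELINE  # repressed (or catabolite-repressed by glucose)

# WHILE lactose is present: keep transcribing; metabolism consumes lactose.
lactose, glucose = 5.0, 0.0
while lactose > 0:
    level = lac_operon_step(lactose, glucose)
    lactose -= level  # induced transcription drives lactose consumption
```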

nyxtom
0 replies
1d

We are never going to move beyond this analogy are we? We seem to be stuck here :/

newphonecartel
0 replies
19h44m

Seconding the motion that this post is lazy, though it did produce some somewhat interesting commentary. I wish more people would see the light as far as SE goes. Here is one perhaps worth reading on that subject: https://superplane39.medium.com/an-interview-with-monica-cel...

...but i really just wish there were more alternatives.

marvinborner
0 replies
1d2h

(Slightly) related: Chemlambda[1], simulating combinatory logic and lambda calculus with molecules.

[1] https://chemlambda.github.io/index.html

martythemaniak
0 replies
1d3h

I got into synthetic bio for a little while and ended up reading this intro textbook: https://www.amazon.ca/Introduction-Systems-Biology-Principle...

If you've taken "Systems Engineering" or "Control Systems" type classes where you learn about oscillators and other high-level control systems work across a variety of engineering disciplines, this is a great intro to how these systems are "implemented" and work inside of cells. It's not "IF" and "WHILE" loops, but more like "How are logical circuits such as E. coli pathfinding algorithms implemented in RNA."

markzporter
0 replies
23h34m

This set of slides provides a good rundown of some of the work done in DNA computing: https://cs.uwaterloo.ca/~lila/pdfs/Computing%20with%20DNA.pd...

Notably solving TSP using the mechanics of DNA. Very interesting!

lo_zamoyski
0 replies
21h38m

The question is misleading because computation is not an objective phenomenon in nature (save in the minds of observers). You can interpret natural phenomena computationally, sure, and there may well be a way to do this with DNA (in fact, some have done that). But there's no computation "out there". Even the computer in front of you isn't objectively computing. It behaves in a way that corresponds with a computational interpretation, but nothing going on inside of it is a computation operation per se.

kul_
0 replies
1d3h

I misread this for some reason and I am now interested to know if DNS can be made to work dynamically this way instead of being just preconfigured static records?

kovacs_x
0 replies
1d3h

Nope, chemistry/physics is involved and DNA processing doesn't work that way. (MSc Comp. Sci. & Bioinformatics / SW engineer here)

It's very much about the physical environment and chemical reactions happening (all in parallel), and molecules of different sizes/shapes interacting under specific conditions -- not instructions for a specific computer architecture being executed.

PS: I'm not saying it's not possible, just that it doesn't happen that way in living cells.

kirykl
0 replies
20h29m

Maybe it does but in a higher dimensionality that we cannot see

johnnyb_61820
0 replies
1d3h

DNA is a lot more like an FPGA than a traditional programming language. Basically, everything is wrapped in a while(true) that has conditionals as to whether or not it engages. Promoter sequences are the condition statements, and transcription factors are very similar to variable states, where the concentration of the transcription factor is essentially its value.
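A minimal sketch of that FPGA-like picture, with invented names and rate constants purely for illustration: every gene conceptually runs inside while(true), a promoter acts as a condition over transcription-factor concentrations, and a concentration plays the role of a variable's value:

```python
import random

random.seed(42)  # reproducible toy run

# Transcription-factor "variables": the concentration is the value.
tf = {"activator": 0.1, "repressor": 0.8}

def promoter_engaged(tf):
    """Promoter as a condition statement over TF concentrations.

    Engagement is probabilistic (toy model): more likely with higher
    activator concentration, less likely with higher repressor.
    """
    p = tf["activator"] * (1.0 - tf["repressor"])
    return random.random() < p

# Conceptually while(True); capped here so the sketch terminates.
protein = 0.0
for _ in range(1000):
    if promoter_engaged(tf):
        protein += 0.01  # transcription + translation bump
    protein *= 0.99      # first-order degradation
```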

i000
0 replies
1d4h

To give concrete examples:

1. The HOXD gene cluster adopts a mutually exclusive conformation. IF it "folds" to the left, it results in the formation of digit bones, ELIF it folds to the right, arm bones, ELSE no bones, etc. (very very roughly speaking): https://www.science.org/doi/10.1126/science.1234167

2. Another example is olfactory receptors: each olfactory sensory neuron (the cell that "smells") chooses to activate one out of a long array of possible receptors, each specific to some smells. So somehow, somewhere, an XOR logical operation is "computed" to "pick" one: https://europepmc.org/article/pmc/4882762
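The HOXD conditional in example 1 can be caricatured as (purely illustrative, the strings are stand-ins for developmental outcomes):

```python
def hoxd_outcome(fold_direction):
    # Caricature of the mutually exclusive HOXD conformation (toy model).
    if fold_direction == "left":
        return "digit bones"
    elif fold_direction == "right":
        return "arm bones"
    else:
        return "no bones"

hoxd_outcome("left")  # "digit bones"
```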

gwern
0 replies
22h57m
fjfaase
0 replies
1d8h

The KMT2D gene [1] is one example I know of a gene that regulates the expression of other genes. Defects in this gene often result in people having Kabuki syndrome [2].

If I remember correctly, Bert Hubert in his talk 'DNA: The Code of Life (SHA2017)' [3] gives an example of an IF-behaviour.

[1] https://en.wikipedia.org/wiki/KMT2D

[2] https://en.wikipedia.org/wiki/Kabuki_syndrome

[3] https://www.youtube.com/watch?v=EcGM_cNzQmE

feverzsj
0 replies
1d8h

It's like reverse engineering alien technology.

drones
0 replies
1d2h

With our archaic use of Von-Neumann architecture, bacteriophages must think so little of us.

dazhbog
0 replies
1d9h

Ghidra plugin for DNA blobs, anyone?

corethree
0 replies
1d2h

It's more like a 3d printer. But that 3d printer can print a computer which in turn has while loops, function calls and if statements.

cariaso
0 replies
1d3h
bilsbie
0 replies
1d2h

I feel like the only possible answer is “we don’t know”

__loam
0 replies
23h6m

It does not.

Rezhe
0 replies
1d6h

Rather than a programming language, it sounds like a natural language.

DoreenMichele
0 replies
1d8h

IF-statements: An IF statement executes the code in a subsequent code block if some specific condition is met.

Cell chemistry can impact production of the protein strings DNA codes for and can impact whether they are even functional.

Not a programmer, so maybe that's not a good analogy. But it's what came to mind as an amateur student of "Exactly how do my defective genes make my life a living hell?"

Alpha3031
0 replies
1d6h

Carroll is probably one of the better-known researchers focusing on the evolution of cis-regulatory elements, non-coding sequences which regulate the expression of nearby genes. I would consider his 2005 book Endless Forms Most Beautiful to be a reasonable introduction. For something less pop-sci, I understand Davidson and Peter's (2015) Genomic Control Process has been fairly well received as well. Though both are on the older side of things now, and admittedly I'm not too familiar with the current literature, they would probably be my recommendations. There are also non-gene regulatory elements, of course.

6177c40f
0 replies
1d1h

One thing I've learned studying biology is that, while there are many decent anologies to computers, biology is really operating on a very different paradigm to the sort of programming most of us are used to.

Biology is based on "primitives" like feedback loops. For example, there are many cases where a protein that gets produced at the end of a signaling chain then goes on to stop or otherwise downregulate its own production (a negative feedback loop, positive feedback loops also exist).

Another fact that biology relies on is that increasing the concentration of some molecule is essentially equivalent to increasing the probability of some reaction occurring. In a negative feedback loop, for example, as the concentration of the end product increases, so does the probability of the inhibitory reaction that product takes part in. Systems then evolve such that the maximum amount of inhibition is most likely to occur when the end product is at just the right concentration.

I've often thought about making a programming language that simulates this paradigm to some extent; a probabilistic programming language where logic was implemented as feedback loops.
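The concentration-as-probability idea above can be sketched as a small stochastic simulation. All rate constants here are hypothetical; the point is only that a protein inhibiting its own production settles near a steady state instead of growing without bound:

```python
import random

def simulate_negative_feedback(steps=10000, seed=0):
    """Toy negative feedback loop: a protein inhibits its own production.

    A production event fires with a probability that falls as the
    concentration rises; degradation is proportional to concentration.
    The concentration settles around a steady state.
    """
    rng = random.Random(seed)
    conc = 0.0
    K = 5.0  # hypothetical half-inhibition constant
    for _ in range(steps):
        # Probability of a production event decreases with concentration.
        if rng.random() < K / (K + conc):
            conc += 0.1         # one burst of production
        conc -= 0.01 * conc     # first-order degradation
    return conc

final = simulate_negative_feedback()  # settles near conc ~ 5 for these constants
```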