18
$\begingroup$

Assume the ability to cybernetically implant a powerful enough computer to operate an AI inside the human skull, on top of or within the brain. Thin wires connect the computer to the various parts of the brain. How can such a computer be cooled?

  • Technology is more advanced than today, so assume the need to cool 150 watts.

  • How the computer is powered is not a part of this question. Assume whatever power source is used is not generating additional heat.

$\endgroup$
14
  • $\begingroup$ Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. $\endgroup$
    – Community Bot
    Commented Jan 17, 2023 at 3:25
  • 6
    $\begingroup$ Terry Pratchett already did this, Detritus had a special helmet. $\endgroup$
    – Pelinore
    Commented Jan 17, 2023 at 6:41
  • 1
    $\begingroup$ @Pelinore +1 for the reference, although in fairness, Detritus's brain basically was a conventional silicon based CPU. $\endgroup$
    – user86462
    Commented Jan 17, 2023 at 8:26
  • 1
$\begingroup$ I think it doesn't need to be 150W. Nature has already proven that a system as good as the human brain can operate on 10~20W (that's what the brain uses after all). It's just a matter of our technology getting good enough to replicate or even surpass that. Take a look at the artificial neural network chips (TrueNorth) that IBM is developing for DARPA. They do the equivalent of 100s of millions of MAC operations per second on like 60mW. $\endgroup$
    – user4574
    Commented Jan 17, 2023 at 16:52
  • 1
    $\begingroup$ @stix Requiring the human body to dissipate 2.5 times the amount of heat it's evolved to dissipate makes 150W the exact opposite of "not much". $\endgroup$
    – Ian Kemp
    Commented Jan 19, 2023 at 15:54

12 Answers

27
$\begingroup$

Blood.

Okay, hear me out: liquid cooling works on contact, and there are roughly 9 pints / 4.5 litres of blood in the human body - that's a really big heatsink, and it's always being circulated.

You may need to do a little extra work during installation - perhaps routing the device past a major arterial vessel to get the maximum mass flow rate - but apart from that, it would fulfill your requirements.
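To put rough numbers on this, here is a heat-balance sketch; the flow rates and blood properties are assumed figures of mine, not part of the answer:

```python
# Temperature rise of blood carrying away 150 W of waste heat.
# Assumed figures: cardiac output ~5 L/min, cerebral blood flow ~0.75 L/min,
# blood density ~1060 kg/m^3, blood specific heat ~3600 J/(kg*K).

P = 150.0      # waste heat, W
rho = 1060.0   # blood density, kg/m^3
c = 3600.0     # specific heat of blood, J/(kg*K)

def delta_t(flow_l_per_min: float) -> float:
    """Steady-state temperature rise of the blood stream, in kelvin."""
    m_dot = flow_l_per_min / 60.0 / 1000.0 * rho   # mass flow rate, kg/s
    return P / (m_dot * c)

print(f"whole cardiac output (5 L/min):  dT = {delta_t(5.0):.2f} K")   # ~0.47 K
print(f"cerebral flow only (0.75 L/min): dT = {delta_t(0.75):.2f} K")  # ~3.1 K
```

Spread across the whole cardiac output the rise is trivial; confined to the blood passing through the head, it approaches the ~3.5 °C figure raised in the comments below.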

$\endgroup$
7
  • $\begingroup$ Comments are not for extended discussion; this conversation has been moved to chat. $\endgroup$
    – L.Dutch
    Commented Jan 19, 2023 at 1:29
  • $\begingroup$ Note that you could easily make this into "blood-style cooling" if it becomes problematic to dump that much thermal energy into the circulatory system. Install a separate, parallel network of vein-like tubes that pump bio-safe coolant across the brain and to a heat exchanger in the lungs. More work to install, but you can design it to fit your needs and minimize the impact on the rest of the body. $\endgroup$
    – bta
    Commented Jan 19, 2023 at 1:40
  • 1
    $\begingroup$ This was my first thought as well. A human makes something like 100W just existing, so adding another 150W of heat for the body to dissipate may require other adaptations as well (elephantine ears?). $\endgroup$ Commented Jan 19, 2023 at 4:16
  • $\begingroup$ +1, blood is used for exactly this purpose too in real life. $\endgroup$
    – vsz
    Commented Jan 19, 2023 at 9:08
  • 2
    $\begingroup$ This is not possible. In an ideal circumstance where all of the 150W of heat is dumped directly into blood that immediately exits the brain, a calculation based on flow rates and the specific heat of blood shows the exiting blood would be warmed by about 3.5C. That means the brain tissue around the exiting blood will also be warmed by almost 3.5C. This is equivalent to a fever of 105F. Do this for long and that brain tissue sickens and dies. And that's the ideal circumstances where all of the heat from the implant goes perfectly evenly into the blood, none directly into solid tissue. $\endgroup$
    – causative
    Commented Jan 19, 2023 at 18:09
21
$\begingroup$

This is a frame challenge but a very reasonable one.

Why would you do that??

Why put so much processing power in your head?
In most cases what is really needed is not on-board processing power but an interface that connects to the brain on one side and to one or more computers on the other - preferably wirelessly, not with a bayonet-style plug as in The Matrix.
This is what is known as a Brain Computer Interface (BCI).

A BCI is a computer-based system that acquires brain signals, analyzes them, and translates them into commands that are relayed to an output device to carry out a desired action. Thus, BCIs do not use the brain's normal output pathways of peripheral nerves and muscles. Brain Computer Interfaces in Medicine

Think about it: first of all, you need connectivity anyway in order to be effective in your world. You mainly want to send requests for complex queries and receive results already analyzed and simplified in a way our brain can process, so you can make an informed decision. You would probably communicate with AI algorithms that have learned to process your queries and have been trained on the kind of results you will want. None of that query processing has to be done inside your head, and it may need to access further online resources anyway. So it makes most sense to have the processing done outside of the BCI.
On the other side, you need an implant that can communicate effectively with your brain. It may need to undergo a period of training to adapt to the user (maybe in their childhood). BCIs that use ECoG seem to be the most promising for the resolution needed in this process.

BCIs That Use ECoG Activity.
ECoG activity is recorded from the cortical surface, and thus it requires the implantation of a subdural or epidural electrode array. ECoG records signals of higher amplitude than EEG and offers superior spatial resolution and spectral bandwidth.

So, in the end, given that heavy on-board processing is not required, heat production would be minimal and its dissipation would not be much of a problem. But you can imagine a simple solution:

HAIR

Metallic hair

Hair for BCI users could be metallic in nature for heat dissipation. You would not need to make every hair a heat sink; a small percentage of them would do, with the others made of synthetic fibre.
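How much could metallic hair actually shed? A standard pin-fin estimate, with figures (copper-like conductivity, natural convection, a full head of hair) that are my assumptions rather than the answer's:

```python
# Convective dissipation from metallic "hair" modelled as pin fins.
# Assumed figures: copper-like k = 400 W/(m*K), hair diameter 100 um,
# length 10 cm, natural convection h = 10 W/(m^2*K), 10 K excess over air.
import math

k, h = 400.0, 10.0     # conductivity W/(m*K), convection W/(m^2*K)
d, L = 100e-6, 0.10    # fin diameter and length, m
dT = 10.0              # root temperature excess over ambient, K

m = math.sqrt(4 * h / (k * d))          # pin-fin parameter, 1/m
eta = math.tanh(m * L) / (m * L)        # fin efficiency
area = math.pi * d * L                  # lateral surface of one hair, m^2
q_per_hair = eta * h * area * dT        # heat shed by one hair, W

n_hairs = 100_000                       # typical scalp hair count
print(f"per hair: {q_per_hair * 1e3:.2f} mW, "
      f"full head: {q_per_hair * n_hairs:.0f} W")  # ~1 mW each, ~99 W total
```

At roughly a milliwatt per hair, even a few percent of a full head of metallic hair comfortably covers the milliwatts a bare interface would emit.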

$\endgroup$
12
  • 1
    $\begingroup$ I agree. A wireless brain machine interface is the answer. It pushes all the processing into a computer somewhere else. That 150W then becomes mW. The computer could be a huge data center, something you are carrying around (or even a hybrid of both). That's sort of the end goal for something like NeuraLink. $\endgroup$
    – user4574
    Commented Jan 17, 2023 at 16:39
  • 4
    $\begingroup$ So that's what those punk hairstyles were! Heatsinks! $\endgroup$
    – Pablo H
    Commented Jan 17, 2023 at 17:30
  • 1
    $\begingroup$ This has the huge advantage that you don't need brain surgery to upgrade your janky old iBrain 6 to a hot new iBrain 7 $\endgroup$
    – Robyn
    Commented Jan 17, 2023 at 22:18
  • 2
    $\begingroup$ Sweet! Hair! And I did not see it coming. For added coolth you could whip your hair back and forth. Or even if you were already cool. $\endgroup$
    – Willk
    Commented Jan 18, 2023 at 0:34
  • 4
    $\begingroup$ @Willk BCI could induce headbanging as a reflex in case of interface overheat. Rock handsign optional. $\endgroup$ Commented Jan 18, 2023 at 9:10
11
$\begingroup$

This technology already exists, and operates on the same principle of forced-air convection as the air-cooling mechanisms for non-cyborg computer installations:

[image: a propeller beanie]

The use and advertising of thermally conductive hair pastes or gels has seen a marked increase since the inception of the brain-chips, while male pattern baldness seen typically in engineers is touted as an adaptive genetic advantage, enabling higher clock rates.

While laboratory testing has yielded mixed results, no commercially viable liquid nitrogen cooling system has yet been offered, owing to the complexity and difficulty of maintaining a livable body temperature in the organism in the presence of the cooling matter and under widely fluctuating thermal dissipation requirements (not to mention the added bulk of fluid reservoirs atop the head; largely only avid AR enthusiasts are found among the early adopters and inventors of prototypes). Lightweight extruded or shaved aluminum heatsinks are still trying to gain traction despite cultural opposition and conflation with the tin-foil variety of cranial apparel, therefore mainstream applications are ordinarily limited to specially designed thermally conductive plastics or ceramics for reasons of social acceptability.

$\endgroup$
1
  • 1
    $\begingroup$ Before the lockdown for the coof, I used to attend a certain international convention almost every year. The executive committee for this convention used to wear devices much like this. fancyclopedia.org/Propeller_Beanie Now I know why. :^) $\endgroup$
    – Boba Fit
    Commented Jan 17, 2023 at 19:17
9
$\begingroup$

radiator fins

Create a hole in the skull with a large fin protruding from the top of the head that can be used to radiate heat.

Think of something like Yondu's headpiece from Guardians of the Galaxy:

Yaka Arrow Controller

$\endgroup$
4
  • 4
    $\begingroup$ This adds a lot of room for cyberpunk style fashion and I like it. Instead of fins it could be horns, spikes, fake animal ears, medusa style snake hair or other tubes. They could have other features or uses in addition to cooling. $\endgroup$
    – Toddleson
    Commented Jan 17, 2023 at 15:18
  • 2
$\begingroup$ This is the solution Alastair Reynolds uses for the advanced Conjoiners in the Revelation Space universe. For extra visual effect, the fins' radiator surfaces are densely packed enough to create iridescence $\endgroup$ Commented Jan 17, 2023 at 19:31
  • $\begingroup$ @thegreatemu conjoiners was my first thought when I saw the question, but thought I would go for something people would be more familiar with visually $\endgroup$
    – mgh42
    Commented Jan 18, 2023 at 0:13
  • $\begingroup$ Also note the knock-on effects this will have on architecture, vehicle design, and helmet/spacesuit construction. As someone who already clips door frames, I welcome a world of 8+ foot door lintels everywhere. $\endgroup$
    – Criggie
    Commented Jan 19, 2023 at 22:57
3
$\begingroup$

Remove bone; replace it with chip + radiator + heat insulation.

Actually, surgeons already do this: cutting a piece of the skull out to get to the parts below is called a craniotomy.

Usually the removed bone would be put back in, but instead we implant a chip module that is insulated downwards (towards the brain) and radiates upwards. Put the skin back over it - perhaps modified for better heat dispersion and heat tolerance - and you won't see a difference.

You can cool the inside with cerebrospinal fluid (liquor) and/or blood if necessary.

$\endgroup$
2
  • 1
    $\begingroup$ Ooh! Yeah! I'm picturing an aluminum Mohawk. Spiky, of course! $\endgroup$ Commented Jan 18, 2023 at 0:13
$\begingroup$ You could replace the bone of the skull with very thermally conductive materials, to direct the heat to the thermally conductive hair. Metals are nice (silver, copper, or aluminium), but something mostly carbon, like diamond, is even more conductive. And while we are at it, why not replace the whole skull to increase the exchange area? Crystal skull! $\endgroup$
    – vinzzz001
    Commented Jan 18, 2023 at 15:02
3
$\begingroup$

OK hear me out. You have a hole in the skull that you drop ice cubes in. The ice keeps the implant safe and cool. You do a handstand periodically to empty out the melted ice before putting new in.

Seriously, you cannot have a 150W appliance inside the head. If your brain temperature rises by 2 degrees Fahrenheit you are seriously sick. If it rises by 5F you are unconscious. Much more than that and you are dead. The brain itself is only about 30W, so you would be multiplying the heat dissipation requirements by six. You can't even let the exterior of the device be 5 degrees warmer than the brain, or the brain cells in closest contact with it would start to die. Can you imagine any 150W computing device that doesn't even get 5 degrees Fahrenheit warmer than its environment? Especially when the device is completely enclosed in an insulated pocket (the brain/skull).

Maybe you could do it by pumping liquid nitrogen from an external reservoir, but you'd better be damn sure that you are keeping every square cm of the implant at exactly the right temp, neither 5F too hot nor 5F too cold, or you will suffer brain damage or death. If the cooling system fails for just a few minutes or has a bug, say goodbye.
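To see just how hopeless passive conduction is, consider a spherical implant whose surface may only run ~5 °F (about 2.8 K) above the surrounding tissue; the tissue conductivity here is my assumed figure:

```python
# Steady conduction from a sphere into an infinite medium: Q = 4*pi*k*r*dT.
# Solve for the radius needed to shed 150 W at dT = 2.8 K through tissue
# with an assumed conductivity k ~ 0.5 W/(m*K).
import math

Q, k, dT = 150.0, 0.5, 2.8
r = Q / (4 * math.pi * k * dT)
print(f"required sphere radius: {r:.1f} m")   # ~8.5 m -- not skull-sized
```

Conduction alone would demand an implant metres across; only forced coolant flow can move that much heat, and the blood-flow arithmetic under the top answer shows what that does to brain temperature.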

$\endgroup$
3
$\begingroup$

An efficient approach would be to do something like the character Lobot from The Empire Strikes Back: a cybernetic implant that fits around the back of the head. This approach has a number of advantages:

  • Nearly all of the electronics are outside of the body, so they can be air-cooled conventionally.
  • Only the interface portion is inside the body, which minimizes the risks and medical costs associated with implantation.
  • Upgrading, repair, and maintenance only require a technician, not a surgeon.
  • Things that sit on the outside of your body are not subject to the regulations and safety requirements of something that goes inside the body, so implant designers have a lot more flexibility and can release product faster.
  • Unlike an internal implant, external electronics can give you all manner of blinkenlights.
$\endgroup$
2
$\begingroup$

I think you simply don't need as much power. At the moment a modern ARM CPU like the M1 Pro has a peak power draw of about 30 W. I think it's reasonable to say that a technological society that manages to make a CPU interact with the brain directly can make these even more energy-efficient.

A 15-20 W peak CPU would be far more manageable and still be more than fast enough for whatever you want to do. 150 W is simply not feasible for a mobile processor. You would probably also need to eat about twice as much just to keep up with the new energy consumption.
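A rough tally of that food bill, assuming (very optimistically) lossless conversion of food energy into electricity:

```python
# Daily food-energy cost of powering a 150 W implant from the body.
# Assumptions: 150 W drawn continuously, a 2000 kcal/day baseline diet,
# and perfect conversion of food energy to electricity.

P = 150.0                              # implant power draw, W
joules_per_day = P * 86_400            # seconds in a day
kcal_per_day = joules_per_day / 4184   # 1 kcal = 4184 J
print(f"extra intake: {kcal_per_day:.0f} kcal/day "
      f"(~{kcal_per_day / 2000:.1f}x a 2000 kcal diet)")  # ~3100 kcal, ~1.5x
```

That's already roughly 1.5 times a typical diet before any conversion losses, so "about twice as much" is in the right ballpark.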

I would probably think about placing it a bit further down, along the spine, so you don't need to worry about thermal insulation as much, since the brain hates temperature changes.

$\endgroup$
2
$\begingroup$

Ah, the good ol' brain-frame computer. All the computing you need right at the tip of your neurons, always on, always ready to play a quick game of Doom at the drop of an eyelash. A mainstay of cyberpunk and certain other types of (extremely) speculative fiction.

With current technology this doesn't stand a chance of happening. There are so many problems with the idea that we've basically stopped trying. No, seriously. We have literally stopped trying to put computers in peoples' brains, because it turns out to be a really bad idea. (The Declaration of Helsinki probably has something to do with it too. Spoilsports.)

What we're doing these days is running wires to send data back and forth to implanted electrodes connected to various neurons. Mostly forth, since reading data from the brain turns out to be bloody hard to do. Most commonly this is used to allow deaf people to hear... kind of. Electrodes in the cochlea to stimulate the auditory nerve are attached to an induction pickup in the skull, which is fed data from an external hearing aid.

There are several good aspects to this kind of arrangement, not least of which is that the hearing aid (the external part) can be replaced almost instantly simply by swapping it out for a new unit. All of the internal elements are chemically inert, no internal power supply is needed and almost all of the heat generated by the system is in the external unit. Winning!

OK, so we also used brain electrodes to remote control cockroaches. For spying, of course. Can't you just imagine a little army of cockroaches with camera packs and tiny little microphones, all controlled by a room full of bored remote operators in Langley or something? (Kinda sounds familiar now that I think on it.)

Meanwhile, Elon Musk has decided that humans need to join with computers to bring about the singularity. So far they have a device that works in pigs and monkeys to read data from the motor cortex, so that's something. The eventual goal is to be able to put electrodes in a human brain so we can transfer data between that brain and a computer, merging humans with their technology.

The question "how can I stick a computer in my head" might not be the best one to ask. Perhaps "where can I stick a computer to talk to the electrodes in my brain" might be better. Wearable computers that you can take off when you shower, perhaps? A computer in a belt-pack or a neck ring? Maybe. Or maybe we just go with Bluetooth. Everything else seems to use it.

$\endgroup$
1
$\begingroup$

A hollow CPU chip with refrigerated glycol or water pumped into the centre and back out

Make the chip cup-shaped and cool it from the inside, not the outside. This uses the CPU itself as an extra protective layer against leakage, as well as a thermal barrier.

It makes the geometry much simpler: simple pipes in and out on the inside of the CPU, wires on the outside.
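The required coolant flow is modest. A sketch, with assumed properties for a 50/50 water-glycol mix and a 5 K allowable coolant temperature rise:

```python
# Coolant flow needed to carry 150 W out of the hollow chip.
# Assumed figures: 50/50 water-glycol with c ~ 3300 J/(kg*K),
# density ~1070 kg/m^3, allowed coolant temperature rise 5 K.

P, c, rho, dT = 150.0, 3300.0, 1070.0, 5.0
m_dot = P / (c * dT)                    # mass flow rate, kg/s
flow_l_min = m_dot / rho * 1000 * 60    # volumetric flow, L/min
print(f"{m_dot * 1000:.1f} g/s (~{flow_l_min:.2f} L/min)")  # ~9 g/s, ~0.5 L/min
```

About half a litre per minute - comparable to a small desktop water-cooling loop, though where the warmed coolant goes afterwards is the harder problem.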

$\endgroup$
7
  • 1
    $\begingroup$ CPU chips are basically thin rectangular solids, they have some layering, but can basically be thought of as 2d rectangles. Making them 'hollow' could in no way make the geometry simpler. $\endgroup$
    – Glen Yates
    Commented Jan 17, 2023 at 23:11
  • 2
    $\begingroup$ The circuitry itself doesn't need to bend, just the case; in this instance, the full "chip" would be a sandwich of insulation -> CPU -> metal cooling surface -> coolant flow area -> insulation, with further insulation on the front/back/sides. $\endgroup$
    – ArmanX
    Commented Jan 18, 2023 at 21:17
  • 2
    $\begingroup$ So long as the refrigerant is non-conductive like glycol, you could suspend the CPU chip inside the glycol channel. Current CPU dies are ~ 200mm^2, with the vast majority of what we think of as a 'CPU' is mostly just polymer (epoxy, glass fiber or plastic) to secure the connections to a plate that allows socket insertion/soldering with less specialized machinery. $\endgroup$ Commented Jan 18, 2023 at 21:20
  • 1
$\begingroup$ If you want an idea of how big CPU dies are, go and look at a microSIM or microSD card. That's about how big a high-end CPU die is. $\endgroup$ Commented Jan 18, 2023 at 21:32
  • 1
    $\begingroup$ Because at that point you effectively have a top mounted heat-sink, not pumping the heat through the silicon. Just mount your die on a liquid cooled heatsink with the passive side down, or use a thin layer of epoxy to separate the heatsink from the bondwires (How pretty much all liquid cooling systems work in practice) $\endgroup$ Commented Jan 19, 2023 at 18:28
1
$\begingroup$

There is nothing better in the animal kingdom at shedding excess heat than the human body, which can shed up to a kilowatt; getting rid of a measly 150 watts would be easy.

The problem comes if that heat is emitted directly in the brain in a point source, which may cause localised heat-related injury.

That gives us two possible ways to design such a device:

  1. Make the implant a distributed neural lace that would sink its heat into the entire volume of the brain (a rough volumetric estimate follows this list). Or:

  2. Put the actual processor elsewhere in the body, and design it to dump its waste heat into a major blood vessel, and simply implant its non-heat-emitting interface into the brain.
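A rough volumetric estimate for option 1, assuming (my figures) a brain volume of ~1.3 L and the ~12 W the brain itself produces, as mentioned below:

```python
# Volumetric heat load if 150 W is spread evenly through the brain.
# Assumed figures: brain volume ~1.3 L, the brain's own output ~12 W.

brain_volume_l = 1.3
own_w, implant_w = 12.0, 150.0
baseline = own_w / brain_volume_l
with_lace = (own_w + implant_w) / brain_volume_l
print(f"baseline: {baseline:.1f} W/L, with lace: {with_lace:.1f} W/L "
      f"({with_lace / baseline:.1f}x)")   # ~9 W/L -> ~125 W/L, ~13.5x
```

Spreading the load avoids a point source, but cerebral blood flow still has to carry all of it away, which is what makes option 2 attractive.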

Now, I have a frame challenge:

Firstly, I should say that modern CPUs convert 100% of their power input into heat... but instructions per watt have been going up as the processing elements (transistors or whatever) shrink and become more efficient. In the future, it is pretty much a given that you'll get more instructions per watt than we do currently. The limitation is then not simply how to cool the co-processor, but the trade-off between the capabilities of the co-processor, the uses to which it can be applied, and its heat output.

Secondly, computers only generate heat for actual computations, and an idling computer uses less power than a computer running at full capacity. Maybe this co-processor can generate up to 150 W, but will it always do so? I think not.

Thirdly, what on earth is this co-processor going to be doing that can draw 150 watts of power? It sounds like it is going to be doing brute-force image and audio processing and graphics and audio rendering - pulling a video signal off the retina/optic nerve/visual cortex, processing it, and dumping a modified image back - and that may not be necessary. The human brain is believed to do all sorts of abstraction, so it may only be necessary to drop a signal onto the brain saying 'you saw an x at y' rather than always processing a modified image. As a comparison, the human brain consumes a roughly constant 12 W.

$\endgroup$
0
$\begingroup$

Unless by "AI" you mean ChatGPT or some other very advanced algorithm that is merely marketed as AI, then yes, you have a problem, because a computer powerful enough to operate a real AI would not fit inside a skull.

On the other hand, if you have proper AI technology available - well established and improved through several generations of miniaturization - then neither power nor heat generation is a problem. Just build it as molecular circuitry and implant it, assuming it generates half or a quarter of the brain's heat.

If the human brain consumes 12 W (some sources state up to 20 W), cooling that is not a problem - we have big heads for a reason, as well as folds on the brain... Adding another 5 W is no biggie either, especially if the location is well chosen.

The biggest problem with your device is not the heat you need to dissipate; it's the device itself. With your assumptions, it is near certain that the package will run hotter than 42 degrees Celsius, which means it will literally fry the human brain on its own (or at least the parts closest to the device).

Rethink your assumptions, because I don't think they are workable.

$\endgroup$
18
$\begingroup$ I say the processor for the brain would be designed to throttle down at 39C already, and/or run at frequencies lower than 0.5GHz in order to not generate too much heat while retaining decent performance. Scaling wide could also be employed: making several interconnected processors right under the skull would also provide fault tolerance. However yes, dissipating the question's 150W is not feasible, as internal temperature would exceed 42C, but dissipating an extra 5W, or even 30W in a sixpack, might well be in range of the brain's existing blood/lymph system. $\endgroup$
    – Vesper
    Commented Jan 17, 2023 at 9:00
  • $\begingroup$ @Vesper - my point is that running AI requires A LOT of computing power. You can throttle everything all you want, but then you will not get enough FLOPs to run a simple rote response bot, let alone anything resembling AI. On the other hand, if you have tech to run AI on a package small enough to be implanted, be certain you'd be well past the point of mundane need to cool "CPU" in it... $\endgroup$
    – AcePL
    Commented Jan 17, 2023 at 9:11
  • $\begingroup$ @AcePL nope - NNs are hugely inefficient, so realistically a real AGI would use much less power, if anything $\endgroup$
    – somebody
    Commented Jan 17, 2023 at 13:21
  • $\begingroup$ you think because of miniaturization they make a computer that operates on 0.5W of heat and costs $20? Great, I'll take 100 of them. Just shove them all in there. I'll be the smartest human alive. $\endgroup$ Commented Jan 17, 2023 at 14:11
$\begingroup$ @somebody - a real AGI needs an insane amount of computing capability - it needs to emulate a rather broad spectrum of human abilities, so unless you can build an artificial brain that works similarly to a mammalian brain, all you can do is EMULATE with NNs. So you have a huge computing power requirement ON TOP of already serious power requirements. $\endgroup$
    – AcePL
    Commented Jan 17, 2023 at 14:54
