I'm kind of jumping into this after reading through the posts, replies, and speculation, so I'm tagging the people I'm specifically addressing; that way you all know I'm responding to things you personally said while replying to the group.
First, let me address the linguistic problem. Whether it's slang or jargon, the simple fact is that language does an abysmal job of providing the contextual clues children use during development to build cognitive links around the concept of intelligence.
Be warned: this will be a long post, because it's a complex topic, and intelligence is a subject that's notoriously difficult for children to learn in the same way they learn everything else.
The concept is as diverse as the human brain, and that particular object happens to be the single most complex physical object in the known universe (known to humans, at least, if that isn't obvious).
I don't want to bore people, so if you already know how amazing the brain is, you can skip this next part. It's mostly just me gushing about how cool it is anyhow.
DATA DUMP
For any of you who might think I'm being hyperbolic, consider this: a microprocessor (current generation) is a single die (piece of silicon) containing bridge and gating logic, at least two "cores," control logic, and connections to the L1 and L2 caches (L3 is only on-die in Xeon processors), and each core hard-codes roughly a dozen stacked instruction sets (dictionaries of operations), from the base x86 architecture to the MMX and SSE extensions. Essentially all of them boil down to 64x64 operator/operand intersections.
Even with all of that, it's still only binary operations, where 64 bits define the operator (the instruction) that works on the other 64 bits (the working set), and each output bit can be 0 (neutral) or 1 (charged).
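If it helps to see the operator/operand idea outside of silicon, here's a minimal sketch in Python; the two "instructions" are made up for illustration, not taken from any real ISA:

```python
# Toy model of the operator/operand idea: one 64-bit word picks the
# operation, another 64-bit word is the data it works on. The opcodes
# here are invented for illustration, not a real instruction set.
MASK64 = (1 << 64) - 1

def execute(operator: int, operand: int) -> int:
    opcode = operator & 0xFF  # pretend the low byte selects the instruction
    if opcode == 0x01:        # "INC": add 1 to the operand
        return (operand + 1) & MASK64
    if opcode == 0x02:        # "NOT": flip every bit of the operand
        return ~operand & MASK64
    return operand            # unknown opcode: treat as a no-op

print(bin(execute(0x02, 0b1010)))  # NOT 1010 -> 64 bits, mostly 1s
```

However wide the words get, the output is still just more bits, which is the point I'm building toward.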
In comparison, the human brain has sodium ion channels, potassium ion channels, neurotransmitters (like amplifiers), neural inhibitors (like resistors), and hormones (which can activate or deactivate additional neural clusters as well as affect the release and uptake of transmitters and inhibitors).
Each of those varied mechanisms acts like one channel in a CPU by carrying one informational factor, so that's roughly 4 factors per signal (2^4 = 16 possible values, where a single bit has only 2).
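To make the "multiple factors per signal" point concrete, here's a toy model. Treating each factor as simply on/off is my simplification (real ion channels and transmitter levels are graded), but it shows how the per-signal state space grows:

```python
from dataclasses import dataclass

# Simplified model of one neural "signal" as four binary factors.
# Real channels are graded (analog), so on/off badly undersells the
# brain; this just shows the state space versus a single bit.

@dataclass
class Signal:
    sodium: bool        # Na+ channel open?
    potassium: bool     # K+ channel open?
    transmitter: bool   # neurotransmitter released?
    hormone: bool       # hormonal modulation active?

n_factors = 4
print(2 ** n_factors)   # 16 states per signal, versus 2 for one bit
```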
Next, the human brain has evolved over time with stacked layers. These have names like occipital, temporal, parietal, hypothalamus, limbic region, Broca's area, Wernicke's ar... ok, maybe I'm going too far.
The point is, each one of those is a mass of neurons large enough to see with the naked eye and hold in your hand.
Packed into each is anywhere from a few hundred million to a billion neurons.
Each neuron has a primary output (the axon) and a cloud of dendrites (like hairs on the other end) whose synapses bond it 1:1 or 1:n to other neurons (or even multiple times to a single neuron). Each of those synapses, and the axon itself, transmits all 4 variables of state to its neighbors, or possibly to a nerve cluster, or to the limbic region for connection to a whole other region of the brain.
So, instead of a 64-bit processor, you have something like a 4!x100 processor (2,400 channels). There are definite signs that each neuron can even make some decisions about operations internally, not merely as part of the overall cluster, but I'm trying to simplify, so I'll let that lie there. So right now we have about 24 billion possible configuration channels on the low end, up to about 24 trillion for larger structures like the prefrontal cortex (and you have two of them).
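Since I threw a lot of multiplication around, here's the back-of-envelope arithmetic spelled out. Fair warning: the neuron counts below are the ones that make my 24-billion/24-trillion totals come out (roughly 10 million and 10 billion neurons), so treat all of this as order-of-magnitude hand-waving:

```python
from math import factorial

# Back-of-envelope version of the numbers above. The per-neuron
# "channel" count and the neuron counts are this post's rough
# figures, not measured values.
channels_per_neuron = factorial(4) * 100        # 4! * 100 = 2,400

low_neurons  = 10_000_000                       # small cluster -> billions
high_neurons = 10_000_000_000                   # big structure -> trillions

print(f"{channels_per_neuron * low_neurons:,}")   # 24,000,000,000
print(f"{channels_per_neuron * high_neurons:,}")  # 24,000,000,000,000
```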
That is the instruction set of the brain, if you will. The operation set is the neuronal network itself (the series of connected and reinforced pathways between neurons).
Each operation instruction (all those trillions of configurations) plays out over fractions of a second while the ongoing oscillating pulses (theta, alpha, and beta waves, running at roughly 4 to 30 Hz) continue. That basically means that, unlike in a CPU, the operator scales over time (then falls off) instead of initiating a state change instantly. The very analog nature of the operations, however, means the accumulated state can itself serve as a whole new operand.
The inference from this "summation" of the complexity is that even though the brain operates at a MUCH lower "clock speed" than a processor, it still achieves many orders of magnitude faster processing, because it sweeps its entire compiled data set a few dozen times per second, while a modern CPU takes the better part of 45 seconds to work through its entire executable memory space, and that's only about 2 GB on average, or about 18 GB on a desktop running many applications.
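To put the "lower clock, higher throughput" point in numbers, here's the comparison as crude arithmetic (every figure is an assumption carried over from above, not a measurement):

```python
# Rough throughput comparison using this post's figures.
# All numbers are assumptions/illustrations, not measurements.

brain_sweeps_per_sec = 20            # theta/alpha/beta range, call it ~20 Hz
cpu_seconds_per_sweep = 45           # figure above for walking executable memory
cpu_sweeps_per_sec = 1 / cpu_seconds_per_sweep

ratio = brain_sweeps_per_sec / cpu_sweeps_per_sec
print(f"brain covers its whole data set ~{ratio:,.0f}x more often per second")
# -> ~900x, before accounting for how much wider each brain "word" is
```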
Have you ever noticed your PC running very slowly while indexing files, or when you have too many browser tabs open? Also, your computer holds all that information in discrete, unassociated chunks, while the human brain has to hang everything it knows on an associated framework (there's a little sketch of the difference after the note below).
NOTE: A leading explanation for why people can't remember anything before about 2 or 3 years of age is that your entire adult memory space is hung on a compatible framework that develops around that time. The things that came before just don't fit the framework any longer.
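And here's the promised toy contrast between "discrete chunks" and an "associated framework"; the data and the spreading-activation lookup are purely illustrative:

```python
# Toy contrast between storage styles. Entirely illustrative.

# Computer-style: discrete, unassociated chunks keyed by address/name.
files = {"report.doc": "...", "photo.jpg": "...", "song.mp3": "..."}

# Brain-style: every item exists mainly via its links to other items.
associations = {
    "grandma": ["kitchen", "cookies", "warm"],
    "cookies": ["grandma", "sweet", "oven"],
    "oven":    ["kitchen", "hot", "cookies"],
}

def recall(cue: str, depth: int = 2) -> set:
    """Spread activation outward from a cue through the association graph."""
    found, frontier = {cue}, [cue]
    for _ in range(depth):
        frontier = [n for f in frontier
                    for n in associations.get(f, []) if n not in found]
        found.update(frontier)
    return found

print(recall("grandma"))  # one cue pulls in a whole neighborhood of memories
```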
BREAK DATA DUMP
As I've mentioned in other posts, the study of the human mind as an empirical discipline (said with a huge amount of tongue-in-cheek) has only been around since about the 1930s, so it's a very, very new "science," and a whole lot of it isn't science at all, thanks to some problems during the Cold War that led to a near-total ban in the West, since the 70s, on many (most) forms of experimentation as unethical/inhumane. Of course, that ban hasn't done a whole lot for our knowledge of how the brain works and develops, but there is still progress, if slow.
One fantastic (and very new) bit of research looks at how information absorption affects perceived intelligence.
"Your brain needs you to read" - Pocket article (the study is referenced in the article; I can link it if anyone wants).
I'm going to take a break here because this is already long and I have other things I need to do.