That’s capturing everything. Ultimately you need only a tiny fraction of that data to emulate the human brain.
Numenta is working on a brain model that recreates functional sections of the brain. Their approach is different, though: they’re trying to understand the components and how they work together, rather than just aggregating vast amounts of data.
Of course, that’s not to say the data isn’t also important. It’s very possible that we’re missing something crucial about how the brain functions, despite everything we know so far. The more data we have, the better we can build and test these more streamlined models.
Those models would likely be tested against real datasets like this one, so the two efforts help each other.
No, it does not. It captures only the physical structures; the chemical and electrical state is missing.
I don’t think any simplified model can work EXACTLY like the real thing. Ask rocket scientists.
Given the prevalence of intelligence in nature using vastly different neurons, I’m not sure you even need an exact emulation of the real thing to achieve the same result.
“Ultimately you need only a tiny fraction of that data to emulate the human brain.”
I am curious how that conclusion was reached, as we have only recently discovered many new types of functional brain cells.
While I am not saying this is the case, that statement sounds like it was based on the “we only use 10% of our brain” myth, which is why I am asking for clarification.
They took imaging scans. For comparison: I just took a picture of a 1 MB memory chip, and omg, my picture is 4 GB in RAW. Photographing the whole stick of RAM that chip was on could take dozens of GB!
“We did the back-of-napkin math on what ramping up this experiment to the entire brain would cost, and the scale is impossibly large — 1.6 zettabytes of storage costing $50 billion and spanning 140 acres, making it the largest data center on the planet.”
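For anyone who wants to redo that napkin math in code, here’s a minimal sketch. It assumes the widely reported ~1.4 petabytes for the 1 mm³ sample (that figure isn’t quoted in this thread, so treat it as an assumption) and a human brain volume of roughly 1.2 million mm³:

```python
# Back-of-napkin scaling of the 1 mm^3 scan to a whole brain.
PB_PER_MM3 = 1.4            # assumed: ~1.4 PB for the 1 mm^3 sample
BRAIN_VOLUME_MM3 = 1.2e6    # ~1,200 cm^3 = 1.2 million mm^3

total_pb = PB_PER_MM3 * BRAIN_VOLUME_MM3
total_zb = total_pb / 1e6   # 1 zettabyte = 1,000,000 petabytes

print(f"{total_zb:.1f} ZB") # -> 1.7 ZB, in line with the quoted 1.6 ZB
```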
Look at what they need to mimic just a fraction of our power.
Heh, we the best. Now excuse me while I use my amazing brainpower to watch questionable anime and get more depressed 🫥
Every day we stray further from God’s light lol
In fairness, the scan required such astronomical resources because of how they were scanning it. They took the cubic millimeter chunk, cut it into 5,000 super-thin slices, and then did extremely high-detail scans of each one. That’s why they needed AI: to piece those flat layers back together into some sort of 3D structure.
Once they have the 3D structure, the scans are useless and can be deleted.
In time it should be possible to scan the tissue and get the 3D structure without such extreme data use.
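To make the “piece those flat layers back together” step concrete, here’s a toy sketch of serial-section reconstruction: rigidly align each slice to the previous one via FFT cross-correlation, then stack the aligned slices into a volume. The real pipeline used ML-based alignment and segmentation at a vastly larger scale; the function names here are just illustrative.

```python
import numpy as np

def estimate_shift(ref: np.ndarray, moving: np.ndarray) -> tuple[int, int]:
    """Estimate the (dy, dx) translation that best aligns `moving` to `ref`."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(moving)))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    h, w = ref.shape
    # Wrap indices into a signed range so small negative offsets work too.
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)

def stack_slices(slices: list[np.ndarray]) -> np.ndarray:
    """Align consecutive 2D slices and stack them into a 3D volume."""
    volume = [slices[0]]
    for sl in slices[1:]:
        dy, dx = estimate_shift(volume[-1], sl)
        volume.append(np.roll(sl, shift=(dy, dx), axis=(0, 1)))
    return np.stack(volume)  # shape: (n_slices, height, width)
```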
And the whole human body, brain and all, can run on ~100 watts. Truly astounding.
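Quick sanity check on that figure, assuming a typical ~2,000 kcal/day diet (an assumption, but a common one):

```python
# Convert daily food intake into average power.
KCAL_PER_DAY = 2000
JOULES_PER_KCAL = 4184
SECONDS_PER_DAY = 86_400

watts = KCAL_PER_DAY * JOULES_PER_KCAL / SECONDS_PER_DAY
print(f"{watts:.0f} W")  # -> 97 W, so ~100 W checks out
```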
Yet with those 100 watts, the brain cannot model itself in this detail - even though it is the literal embodiment of it! A strange thing to consider.
Meh. No different than how you can make a programming language and then have it compile itself. It’s weird.
Shouldn’t be long before we have that much in our phones.
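“Shouldn’t be long” is doing some heavy lifting there. A rough extrapolation, assuming today’s ~1 TB flagship phones and a storage capacity doubling every ~2.5 years (both loose assumptions):

```python
import math

phone_tb = 1.0            # assumed current flagship phone storage
target_tb = 1.6e9         # 1.6 ZB = 1.6 billion TB
years_per_doubling = 2.5  # loose assumption about capacity growth

doublings = math.log2(target_tb / phone_tb)
print(f"{doublings:.0f} doublings, ~{doublings * years_per_doubling:.0f} years")
# -> 31 doublings, ~76 years
```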
Storage vendors are rubbing their hands in delight, while systems administrators, particularly backup admins, are cringing at the thought.
“Google to shutter human brains”
Why anyone teams up with a company whose greatest achievement is a high score on the “I wish they hadn’t shut that down” list is beyond my understanding.
Because almost nobody else has enough money for such research, and governments won’t pay for it because it’s not very useful.